/robowaifu/ - DIY Robot Wives

Advancing robotics to a point where anime catgrill meidos in tiny miniskirts are a reality.



Selecting a Programming Language Robowaifu Technician 09/11/2019 (Wed) 13:07:45 No.128
What programming language would suit us and our waifus best? For those of us with limited experience programming, it's a daunting question.
Would a language with a rigid structure be best?
Do we want an object-oriented language?
How much do you care about whether or not a given language is commonly used and widespread?
What the fuck does all that terminology mean?
Is LISP just a meme, or will it save us all?

In this thread, we will discuss these questions and more so those of us who aren't already settled into a language can find our way.
>>128
I'm hardly an unbiased developer, having worked in C++ professionally on and off for 10 years now. I consider it the world's most flexible language, and it's probably also the single most important language today for AI--an area obviously of much importance to /robowaifu/. I've used a few other languages in my career and hobbies, but C++ is certainly my favorite one. While it's not the most ideal language for a beginner to start with, it can be done, and if you begin with it you will both understand the 'machine' very well and avoid the belief in 'magic' that Python and other languages will feed you.
>How much do you care about whether or not a given language is commonly used and widespread?
Well, it certainly has an impact on the widespread availability of developers out there, and thus the likelihood of some of them connecting with us and taking an interest in developing robowaifus. There are approaching 5 million professional C++ programmers today, and every single one of them cares about performance. I personally consider this a vital differentiator for our potential success in devising our own robowaifus, but again, I'm hardly an unbiased source.
>>12

Python is probably the single language with the most promise for a beginner IMO. It's widely used as a general language and scripting tool for admins. It's used all over the sciences now, especially in combination with the R platform. It's the scripting API for Tensorflow (the single most important Deep Learning framework today). The downside is that if you ever decide to go deeper into programming and actually understand how everything works, you'll have to spend quite a while unlearning the belief in 'magic' that Python will inevitably force on the beginner.
https://www.python.org/

Scala is probably a very good choice for a beginner as well. Kind of a mashup of several languages, it lays a great foundation for functional programming in general.
https://scala-lang.org/
I have minimal experience programming. I've dabbled with Python, HTML, CSS, and know very little beyond the very basics.
That being said, I'm not afraid to go in any direction I see fit, and I work well under a steep learning curve if the topic at hand interests me.
With that in mind, no languages are off the table as far as I'm concerned. Only problem is I have basically no fucking idea about what language I should really use.
If it helps narrow things down, I'm not concerned with the hardware aspect of things at the moment. I'm just concerned about making the personality, and I'm looking to have her be an emergent system. The more heavy lifting I can offload to the computer, the better. I also want her scope to be easily expandable.
For instance: let's say I've already succeeded in getting her to be fairly competent, and I want to see if I can get her to play a game to test her capability for logical reasoning. This hypothetical game is a game that I didn't program myself. I'll want to build that bridge as easily as possible.

On a final note, both logical reasoning and abstraction tend to come naturally to me, but usually from weird angles.

>>130
Kek, you posted literally right as I was finishing typing all this shit.
>>131
>Kek, you posted literally right as I was finishing typing all this shit.
Heh, synchronicity.

TBH, you're dealing with a question I think every guy who wants to learn programming for real faces. I can probably give you some ideas about the theoretical basics, and general architecture questions. But truth be told, I think like a C++ developer, having done it now for all these years. I dove into it intentionally, and ultimately because I wanted to create both AI and robots. But even if I hadn't had that as a long-term goal in life, I still consider it to have been one of the better choices I've ever made. It's paid many bills, and there are lots of well-paying jobs out there for talented C++ devs.

AMA, but in the end you'll still be faced with the original conundrum anon: You still have to decide. This will probably mean months (years?) of serious research and trial and error. You've stated that logical thinking comes easy to you, so that should ease the process for you.
>>132
I guess the two biggest questions I have for you right now are:
1 - In regards to my previously posted concerns about expandability, how does this play into the equation? Is C++ a good language for that, is it something that can be done with just about any language, or am I looking for something that doesn't really exist?
2 - I should have mentioned this earlier, but both of my servers use archaic architectures (Alpha64). If at all possible, I want to incorporate those into my designs. Essentially this would mean having them mostly do behind-the-scenes number crunching, and little else. Of course, I also have a modern desktop which is considerably more powerful, so it's not a matter of life or death, but the inclusion of those servers is ideal.
>>133
tbh, you'd have to very explicitly and specifically define your meaning of the word 'expandability'. To me the word naturally connotes 'flexibility' and as I've already stated ITT, IMO C++ is literally the most flexible language in the world. Several other languages have been literally written using C++ underneath, to give you some idea.

However, 'flexibility' definitely doesn't equate to 'developer productivity' which I suspect is actually closer to your underlying definition of the word. As with other areas of engineering, everything is a tradeoff. Other languages may make you more productive at the cost of flexibility. In other words, if you can accomplish exactly what you want and achieve all the engineering requirements in a particular language, then you're ahead--as long as you don't need to change that scope later.

This kind of knowledge honestly takes years to acquire.

BTW, IIRC GCC supports the Alpha architecture, so probably any front-end language it supports will be OK with its compiler--C++, for example. YMMV, so you'll have to confirm this specifically; I'm unable to. If you can compile a 'hello world', then you should be good. E.g., a file named main.cpp :
#include <iostream>
int main() {
std::cout << "Hello World\n";
}


compile & run with:
g++ main.cpp && ./a.out

If you see
Hello World

then you're golden.
>>134
hmm. looks like newlines don't render inside code tags. :/
Open file (296.33 KB 500x700 1414990391770.png)
>>134
Certainly food for thought.
By "expandability" I mean: Taking a program I wrote, and making it do something that wasn't accounted for in the original code.
As an example, let's take my general plan for developing the presently hypothetical AI.
I will probably just have her start out as a really basic chatbot that can read large datasets. But later on, I will want her to be able to interface with other programs. This may include: playing a game, crawling the web for more data, maybe even communicating with different AIs that other anons programmed.
I might also want her to have a graphical interface later on. I will even want her to be able to make her own programs at some point.
One thing I will certainly want to do over time is optimize and adjust the way she parses data, and at some point have her be able to do that herself.

Of course, I won't be jumping straight into all that shit from the get-go like a maniac. Version 1.0 is going to be so basic that she won't even be able to count to potato. But I will want a language that makes the process of adding random shit like that as easy as possible.
>>136
Heh, I was following you right up until the very last sentence.
>But I will want a language that makes the process of adding random shit like that as easy as possible.
You were describing again and again the need for incredible flexibility, and C++ certainly fits the bill. Then at the last second you jumped all the way over to the other end of the spectrum and made ease of use the priority. C++ definitely doesn't fit that bill heh.

See what I mean? Everything in engineering is a tradeoff. Using well-defined approaches to software engineering can reduce dependencies on specific language features somewhat, but that has its limits. The language choice for a project often dictates the limits of what can actually be accomplished, practically speaking.

I'm not trying to overcomplicate things, honestly. Real-world engineering is already complicated af all by itself.

My instinct is you should try Scala, anon. It will give you a great learning platform, both flexible and powerful, yet pretty easy to use as well. If in the end you find you need even more power (and you will), I'll bet you'll have a smooth transition to an even more flexible and powerful platform like C++.

BTW, I may be 'logging off' before long, but I'll be back and answer anything else later on.
>>136
Also BTW
Nano best gril
Open file (107.70 KB 1280x720 1462206904485.jpeg)
>>137
When I said "easy", I wasn't really thinking my words through. I apologize.
What I really meant was that I want a language that's built to do the kind of shit I described, as opposed to something super rigid or limited that I'd have to use fucky workarounds with.

Also, I briefly spoke with one of my professors earlier today about it (though not in so much detail).
His recommendations were: C++, R, or Clojure. He also mentioned another language I can't remember the name of that's supposed to be C++ on steroids or something.
>>142
I see. Well, there's certainly no magic 'make robowaifu' button to press anon. As with Sir Isaac, I'd recommend you stand on others' shoulders the way we all do. Find an opensauce project that's sort of close to what you imagine, then start digging. Build and run the system as-is. Start tinkering around with it. With any luck, in a few weeks or months you'll have a good idea what makes it tick, and then you can start over from scratch and use your own ideas instead.

There simply is no replacement at this time for human ingenuity in the software engineering field. It's a very creative process after all.

>R
Not sure how that would serve very well in a general sense for our endeavors anon. I could be wrong, but it's not something I would recommend you pursue atp.
>Clojure
Wouldn't be a bad choice, if rather esoteric. Scala has most of the good things about Clojure and is far more mainstream, as opposed to academic.

>'C++ on steroids'
Hmm, can't say I've ever heard of that. C++ literally is pure steroids, almost by definition.
Open file (188.41 KB 500x347 IMG_8040.GIF)
>>145
I really wish we had more than just three people here, it's hard to get fired up in a ghost town
I already have a somewhat abstract plan of attack. Study a language, make a bunch of tiny programs and fuck around with features as I do so, study existing examples of AI, make note of interesting things, repeat ad nauseam until I get to the point where I feel like I can write something basic that is only 99% terrible, and build up from there. I have no delusions of this being any less impossibly difficult and time consuming than I'm imagining.
At this point, as far as I'm concerned, it's a choice between C++ or Clojure. I'll sleep on it and make my choice in the morning. Whatever I choose, I'll shell out some shekels for a few books and just get into it.
If I don't force myself in a direction while there's still a fire under my ass, then it'll be a long-ass time before that fire comes again.
>>146
>I really wish we had more than just three people here, it's hard to get fired up in a ghost town
There's the Meta thread for discussing such topics anon. I personally need just the internal vision in my heart of literally millions of men dispossessed by the evils of feminism to give me the motivation I need. For me it's literally a spiritual issue.
>>146
>repeat ad nauseam until I get to the point where I feel like I can write something basic that is only 99% terrible, and build up from there.
Heh, sounds like a plan. :^)

Yeah, it's important in life to move while you have momentum anon. Whatever you decide on, I'll try to help you to the degree I can.
>>147
>>148
The three people (possibly 4, I don't know if you are including me) are loyal men to their brothers in arms. I am glad I found a small wholesome community like this where we can freely share ideas with each other without the fear of losing our jobs or being grounded by our parents. (^: I mean this in the most literal sense: I love this community. Also, when you guys get time, I wanna speak about something concerning in the meta thread.

Sage for off topic
>>146
I plan to set up my box with Clojure and follow along with you anon, so hopefully the guys get code tags working properly here soon.
>>161
>that pic
lol nice
>>175
I got it from /b2/ in the final days of 8chan before it was shut down. It's a little scary to think about. I guess it's kind of like reading a German newspaper from early 1945...
I'm back, and I decided to go with C++.
Clojure was certainly tempting, being a LISP dialect with full Java support and a growing corporate userbase. In the end though I couldn't deny the far superior wealth of support and documentation that already exists for C++.
Bought a guide for learning it, and a book about algorithms because I have a feeling I'll need that down the line.
>>190
OK, sounds good. AMA. I should be able to help you with C++ and using it well. May or may not be able to help with original algorithm designs. OFC the C++ bread already has important links, not the least of which is the book list maintained over on StackOverflow--get familiar with it. Welcome aboard the coder train anon. May we have robowaifus in our hearts and homes soon!

BTW, you'll need to figure out a good editor that you can be comfortable with soon.
I hope your C++ studies work out for you anon.
>>130
This is good advice. I'd modify it slightly to say that a true novice should cut their teeth on Python (I hear that Python The Hard Way is a good primer) and then add some C on embedded systems - preferably something real-time - before touching C++. Python, "magic" and all, has the main advantage of laying down fundamental patterns of thought, and time-sensitive embedded C gets you close to the machine so that Python's abstractions won't damage long-term development. All learning consists of telling lies to small children; the trick is knowing when to take the lies away and replace them with the next, slightly less untrue set. In any case don't learn with JavaScript and definitely don't touch PHP until you have several better languages under your belt. I've observed that people who start with either of those two languages seem to suffer some kind of brain damage that carries over insufferably into other languages.

Clojure is also a decent choice as a first functional language. Functional experience pays dividends in any language. I hear that "Clojure for the Brave and True" is a good primer.

Don't neglect your tradecraft. Learn version control early. Understand how to collaborate with your future and past selves. Learn what good, disciplined thought feels and smells like and what sloppy thought feels and smells like. The Jargon File is a decent source of tips on developing your mindset, although some of it must be understood in its historical context.
>>428
>All learning consists of telling lies to small children; the trick is knowing when to take the lies away and replace them with the next, slightly less untrue set.
heh, creative way to word that anon. Great advice overall. What can I say, I love C++. I make no bones about it tbh. I learned C first, and wish I hadn't. If I'd had Ch27 of Stroustrup's PPP2 back then, I would have both understood C better and transitioned to C++ much more quickly thereafter.

Anyway, thanks for the advice and the tips. That kind of post helps out here a lot.
>>428
>The Jargon File
Thanks anon.
>>134
Could you please tell me the browser and OS you made this post on? Thanks.
>>539
Chromium derivative & Ubuntu.
>>428
It's funny, but I keep thinking back to your words anon, especially the "Understand how to collaborate with your future and past selves." bit. That idea is helping to guide me in how to choose better function and variable naming, how to refactor code out to stay focused in the 'inner core' of a program, and generally how to do better architecture of my stuff. Thanks for the tip anon!
>>1271
I am glad that the words helped you, but don’t thank me, Anon - thank Master Foo: http://www.catb.org/~esr/writings/unix-koans/prodigy.html

>“Moving in accordance with the law of nature, it unfolds inexorably in the minds of programmers, assimilating designs to its own nature. All software that would compete with it must become like to it; empty, empty, profoundly empty, perfectly void, hail!”
Open file (21.65 KB 265x400 106728.jpg)
>>1272
very nice, here's one for you anon (though regrettably you'd probably have to get the 'made-from-atoms' version to enjoy it).

www.goodreads.com/book/show/106728.The_Timeless_Way_of_Building
Some lisp and AI related books:
>Structure and Interpretation of Computer Programs (SICP)
Book: https://mitpress.mit.edu/sites/default/files/sicp/full-text/book/book.html
Lectures: https://www.youtube.com/watch?v=-J_xL4IGhJA&list=PLE18841CABEA24090&index=1

>Paradigms of Artificial Intelligence Programming
Book and code: https://github.com/norvig/paip-lisp

>Practical Common Lisp
Book and code: http://www.gigamonkeys.com/book/

>The Common Lisp Cookbook
Book: https://lispcookbook.github.io/cl-cookbook/

>Clojure For The Brave and True
Book: https://www.braveclojure.com/clojure-for-the-brave-and-true/

>>128
>Is LISP just a meme, or will it save us all?
Lisp is good for learning (Scheme and SICP) and it allows one to build working programs fast (Clojure and Common Lisp), in addition, lisp has been traditionally used in AI programming.

>How much do you care about whether or not a given language is commonly used and widespread?
It's important that the language...
1) is tested and that it actually works
2) has a community, so you can get libraries, tools, learning resources and help
Lisp (Common Lisp and Clojure), Java, and C++ are examples of languages that fulfill these requirements.

>What the fuck does all that terminology mean?
Object-oriented programming (OOP) just means that the language supports one particular way of programming. Nowadays OOP is the most widely used in industry, but functional programming has been gaining a lot of attention lately.
>>1322
Thanks for the links, Lisper-Anon.
This is an open-ended question so feel free to respond with your own opinion or thoughts.

As I understand it, compilers are continuing to improve to the point where even very savvy coders have a hard time beating them. So if a higher level programming language still compiles to fast code, then what are some advantages of using a lower level language in the context of AI and robotics?
>>1515
Well, I'm no expert in compiler design, and for the men who are, the phrase 'higher level programming language' applies to C and basically everything else outside of assembler or actual machine code.

You're making the mistake atp of lumping everything into black and white. 'Very savvy coders' can probably implement a 'better' generic programming implementation than, say, C++ template meta-programming by hand-writing assembler, but that would come with a few trade-offs;
-1. It would only be for a specific example, whereas the high-level meta-programming approach is fully generic.
-2. It would require months of the developer's concentrated effort to produce a more performant version than say, GCC right out of the box.
-3. It would only work on a specific type of CPU, for example Intel or ARM.
-4. It would be a bitch to maintain that code after the fact (IMO).

So yea higher-level languages like C++ are certainly 'better' than low-level languages in that case.

Further, I'll presume the question is related to the common perception of 'higher-level'--and please define that explicitly, w/o resorting to 'well, kinda like Haskell, you know' heh. While there are literally millions of professionals programming in C++ daily for a living, most amateurs consider it some small niche thing (and therefore automatically both difficult and not worth learning).

Python certainly has more popularity and is considered higher-level. Examples of its use abound in the sciences. But you certainly wouldn't write a 120fps FPS using it. In engineering, everything is a trade-off, and there's no free lunch--only elegant and inelegant approaches.

I use Python and C++ specifically because of OP's topic ITT: What programming language would suit us and our [robo]waifus best?

TensorFlow is arguably the single most important AI ML/DL framework right now. There's rather a good user API for the tool using Python. But the engine itself isn't written in Python, but rather in C++ (there's also an API in that language too ofc). TF performs metric shittons of intensive math operations, and you definitely wouldn't be too happy with the performance if those were done in Python instead of C++. Yet calling into the C++ code w/ Python is a great idea.

Each language has its place, and there are definitely still quite wide performance gaps between very mature, highly-optimizable languages like C, C++, and FORTRAN, and modern coffee languages like Python, JavaScript, and C#, when performing math-intensive operations like tensor calculus on billion-cell matrices.
I don't think it's a good idea to advise beginners today to learn C++ as a start, because it seems to be hard to work with. Python is what many scientists and beginners use. We'll need to glue different existing programs together; we won't write them completely on our own. For that, Python is the way to go. Speed only matters in some cases, btw.
>>4338
I understand your point Anon, but honestly, C++ isn't that hard to learn if you focus on the proper essentials. The problem is there's such a yuge boatload of terrible, terrible educational material on C++ out there. It's really given guys the wrong impression of the language imo. I hope to correct some of that with the TDD.
>Speed only matters in some cases, btw.
Maybe in a general-purpose app or program. For practically the entire spectrum of software engineering related to robowaifus, speed is king. We literally will not be able to solve this domain successfully without well-written C & C++ programming. But yes, once the core engines are written and debugged, then adding Python scripting on top of that will be suitable ofc.
>>4358
>I hope to correct some of that with the RDD.*
derp. >>3001
>>4358
The engines for everything I can think of are general-purpose; other people are using them and working on improvements. Could you give me an example of where we need fast software that isn't used outside of this project? Btw, there are ways to make Python code faster if necessary, same for many Lisp variants. However, I have no intention of starting one of these famous fights about programming languages here. I never really tried CPP, had no reason so far.
Open file (683.26 KB 1511x1997 fig-runtime-arch.jpg)
>>4376
>Could you give me any example where we need fast software
Two words, Anon: Hard Real-time
OK maybe that's 3 words heh
There is literally nothing software-controlled inside a safety-critical system (such as a gynoid robot expected to be around naive humans like children, for example) that isn't absolutely dependent on (billions of) calculations being done not only properly, but within a hard-limit time budget. I can go on about this for a while, but to save some time, do a little research of your own on those two words first Anon, and then come back and ask further ITT.
>>128
Good day fellow robo-lover! In reality, if we consider waifus to be both hardware and software, then there will be several groups of languages used:
-Low-level languages such as architecture-specific assembly, C, maybe (unlikely) C++, which will be needed for embedded controllers with limited RAM and processing power.
-Languages that will be running within an OS environment for machine learning, bigger data processing etc. Those could be C++, Python, Go...you name it XD
Coming from an electronics background, I think the topology of the waifu bot should be:
-Simple, minimal driver code for interfacing with the hardware, which by itself can execute a set of actions/steps
-A higher-level system for managing the overall state of the bot by aggregating data via APIs
-I'm not fond of IoT, so I'd prefer the network connectivity to either not be present, or be disconnected from the hardware control.
I'll always recommend starting with C, just because it forces you to study the underlying architecture. However you should learn from a guide/book (like Ken and Ritchie's C), as learning it on your own is more frustrating without enough background. If you want to focus on higher level, I'm not sure what to offer. I use Python, but as mentioned earlier, it hides a lot from the beginner.
Open file (15.96 KB 260x343 knr_c_2e.jpg)
>>4403
>However you should learn from a guide/book (like [Kernighan] and Ritchie's C)
Here's the book cover. You literally cannot be involved in the C community w/o running into this.
>>4405
Hehe, it's a well written book (though I've only read the first two chapters and usually use the others for reference).
Also, perhaps you chaps might point me to the right thread, as this is a little off-topic. If there's no thread I could create one. I wonder if anyone has considered using alternative CPU architectures (i.e. not x86/amd64 or ARM) to make sure the waifu bot runs on an open architecture where the documentation can be readily acquired. For me this is important from a security perspective (Intel ME...) because a waifu bot is a very personal object and may be able to acquire a lot of info about you. I was once hyped about RISC-V until I heard some open source projects were struggling to get the necessary changes added to the standard (Libre-SoC). OpenPOWER might be interesting as it is well established.
>>4406
We are very concerned about the botnet issues here, both from security and privacy perspectives. We don't have a dedicated thread, but it is touched on in our SBC & microcontrollers thread. >>16
Please feel free to start an entirely new thread on the topic Anon if you feel qualified. Since this is such a technical topic, if you wouldn't mind, make it an effort-post OP detailing both the hazards of commodity processors and maybe something about the efforts in the alternative communities so far. Plenty of links always help round out a good OP ofc. Thank you.
>>4407
Sure, I'm quite busy on weekdays, but I'll see if I can do a little write-up for the weekend. I'm glad I found this place, as for once I can have a serious conversation about implementing waifus with current technology, not just wishing and hoping! Thanks!
>>4406 BTW, this might be a good post to introduce to newcomers. >>2701
>>4408
>Thanks!
You're welcome. Glad to have you here.
>related xpost >>6712
Here's something that might be useful. It seems to be picking up a lot of adherents.
Nim
https://nim-lang.org/
It's supposed to be easy to use like Python, but it has a lot of advanced features without so much complication. Some features: it has macros, which can make programs rewrite themselves according to the situation and are supposed to raise productivity if you know how to use them (macros are a big deal in LISP and C++). It can compile to C or JavaScript. It has more than one level of garbage collection and memory management, so you can dash off stuff to test, then gain speed by allocating memory yourself once what you wrote works well. Seems like something easier to use than C++ and LISP but more useful than Python, without being so big.
>>8524
Python is very common in science and education, though. It also has a lot of libraries, and speed isn't always an issue. However, I've wanted to try out Nim for a while now. If it becomes popular here, I'm quite sure I will do so.
One thing is that if we're going to have different languages interacting, then the language that will make this happen is C.
Additionally, robowaifus are cutting-edge tech, so we won't be able to cut ourselves some slack with convenient off-the-shelf hardware and operating systems. We'll have to deal with purpose-built hardware and software, which inevitably comes with odd behaviors and optimizations. Creating and making use of those requires knowing the theory behind assembly, so that when the hardware pops up, an assembly programmer will already know all about memory models, vector instructions and whatnot, and will be able to pick up programming for this new hardware with just an instruction set listing.
Why are soundcards cheap? Because inside them is a DSP with a native word size of 24 bits which can only read and manipulate data in that one size; it also has weird instructions that optimize reading an array in an unusual order common in DSPs--not a full-blown ARM core made with many more millions of transistors providing features a soundcard will never use. Similarly, the average router until recently had a MIPS CPU with no floating-point unit; floats aren't relevant for router tasks.
At this point the whole range of programming languages is required somewhere: straight-up machine code, compiled languages, and scripting languages. Focus on theory, general experience, and topics related to robowaifu engineering. Whatever you know will likely be of use somewhere.
>>8608
>At this point the whole range of programming languages is required somewhere.
This Anon gets it. We're going to be having all sorts of hardware too, probably broadly divisible into mobile (ie, onboard the robowaifu) and immobile.
>mobile:
At least three types and probably more:
-edge sensors/actuators. Neuromorphics wants to both combine these together and push large numbers of them out to the edges of the robowaifu.
-microcontrollers. Shepherds after a fashion, directing all those edge devices and reporting up and down the chain with the SBCs.
-sbcs. The mobile 'brains' that will manage most of the heuristic-type work for the systems. Ofc, they will stay in touch at least intermittently with the immobile resources.
>immobile:
At least 3 types of systems, probably more:
-servers for data, mostly big storage devices.
-servers for AI training, lots of GPU horsepower.
-gateway systems for securing and defending the rest.
On top of that is typical networking, etc. already common in anon's homes.
>tl;dr
We'll all need a lot of different types of software in the end. So, just make a choice and dive in.
Open file (566.46 KB 1912x1186 graphit-perf_comparison.png)
I'm probably going to look into this at some time: GraphIt, a domain-specific language for graphs.
https://graphit-lang.org/
https://youtu.be/ptIVf-YlkhY
It outputs C++, but optimizes the code based on search algorithms for graphs. Or something like that, lol.
My two cents, as the anon delivering Waifu Engine: the renderer is built with C# using Unity, and the AI core is built with Python. With Python, learn the basics, then learn what duck typing is versus every other language's type system. Then learn how to work with a team by building a domain language around how you communicate. Worst case scenario, you end up with a job. The best thing about Python is that there are many resources if you get stuck; for languages like C++, Lisp or Haskell there are few. I know this because I came from a Haskell background, and used it to parallelize data workflows using DAGs. You want the language with the lowest barrier to entry; any other language will discourage you as a learner. There's a lot to learn in programming, though the mental models transfer over to other languages; you just need to learn the details and nuances.
>>10495
Thanks very much Anon, I'll look into it!
>>10497
>You want the language that will have the lowest barrier of entry, any other language will discourage you as a learner.
I absolutely agree with this notion for a true beginner. But compiled languages like C & C++ bring a critical robowaifu engineering requirement to the table: efficiency & performance. Because these two compiled languages output machine code that quite closely mirrors the actual underlying hardware (at least when well-written), they don't introduce undue energy-consumption and wall-clock hits. And particularly for physical robowaifus, where energy is everything, keeping battery drain low is absolutely vital. C & C++ both handle this like a boss.
Common Lisp would be the best option; however, you'll need at least some level of proficiency with it to actually use it effectively, which is why something simpler would be better.
Fast.ai will use Swift for TensorFlow in a few years, and already uses it in some places under the hood. For now they advise beginners to use fast.ai and PyTorch. https://youtu.be/XHyASP49ses
>>10553
I have experience in some Lisp, and will go on using it and learn more dialects. However, it always depends on the job, because of libraries and such.
>>128
Probably a more advanced set of programming than what we already have with C++ and machine code for manufacturing machines, so the waifu has perception and doesn't try to punch a hole in a wall while reaching for the remote.
>>12329 So, have any productive suggestions Anon, or just plying dystopic imagery here?
>>12335 You clearly don’t anon
>>12329
This isn't resolved via language, but via mechanical strength stepping under a complex algorithm.
tl;dr: there's a reason we as biological organisms have adrenaline and super strength when under its effects. In a calm, relaxed state we are "weaker", but this is why we don't constantly injure ourselves or break tools and utensils. It's also why we have superior fine motor coordination and dexterity relative to other primates (who have superior strength and power).
Our R/Ws are going to need different types of actuators for soft touch versus when power is needed (to lift, walk, jump, etc). Otherwise a missed line of code would result in a hole in the wall, or worse, a sensitive part getting pinched or someone ending up seriously injured. Simply put: when a R/W is in intimate mode, turn off the hydraulic/strong pneumatic actuators and only run e/m servos, solenoids (with resisting springs, counterweights, etc) and nitrile fiber. When the R/W has the "all clear" for outdoor-type activity, resume full power. Just one example.
Anyway this probably belongs in the actuator thread >>12131 or another thread, but I've been sitting on this concept for a while and wanted to type it out before it's lost.
>>12361
It's a good point that we'll need multi-level forces within our robowaifus for them to work properly for us, Anon.
A good, actually very good, alternative would be Forth.
https://en.wikipedia.org/wiki/Forth_(programming_language)
Forth was designed when computers had very little memory or power. It's very concise and is productive in environments where speed and close-to-the-hardware work are needed. It's very small and fast. One of the big benefits is that it's made up of words that can be combined, so once you write a routine, you can use that word to perform an action. It's used for all kinds of tasks that demand productivity and speed, and for one-off programming jobs or very specific hardware. Look it up; I think you will be impressed. There is also a version for ESP32 microcontrollers, which I think look to be some of the most powerful, most cost-effective, versatile microcontrollers built today.
>>12859
Interesting. Is it low-level enough though? We'd need direct access to hardware resources via pointers to do many of the practical things required to build software for a robowaifu. I know C & C++ offer this, and I'm pretty sure that Ada and D do as well. There are probably others, but tbh there are pretty solid reasons that C & C++ absolutely dominate the systems programming world. And it's not b/c of politics or power really, but that other p-word: pointers.
>>12951
>Interesting. Is it low-level enough though?
Yes. Forth used to be (not sure about now) THE language motherboard manufacturers used to boot the computer and set up all the hardware for the OS. It was originally invented by Chuck Moore to run a big science telescope. When microprocessors had limited memory and power, Forth was used a lot because it's small and very versatile.
>>12951
>Pointers
"...A compiled Forth program is a collection of words, each of which contains a statically allocated list of pointers to other words..."
"...Forth programmers traditionally value complete understanding and control over the machine and their programming environment. Therefore, what Forth compilers don't do reveals something about the language and its use. Type checking, macro preprocessing, common subexpression elimination, and other traditional compiler services are feasible, but usually not included in Forth compilers. This simplicity allows Forth development systems to be small enough to fit in the on-chip ROM of an 8-bit microcontroller. On the other hand, Forth's extensibility allows "full-featured" systems to consume over 100K bytes and provide comprehensive window-based programming environments. Forth also allows (and often encourages) programmers to completely understand the entire compiler and run-time system. Forth supports extremely flexible and productive application development while making ultimate control of both the language and hardware easily attainable. ..."
http://users.ece.cmu.edu/~koopman/forth/hopl.html
Since most waifu work will revolve around passing messages and reading positions, you don't need any large OS. So Forth is likely good for waifus.
>>12981
Wow, sounds like a pretty compelling position, Anon. I'll plan to make time at some point to have a go with it. Thanks.
Here's a link to the history of Forth; it should give a better idea of its strengths and weaknesses. https://www.forth.com/resources/forth-programming-language/
One more Forth link. It's microForth, which is written in C, so it's portable. If you search for microForth, you'll find manuals. https://github.com/Earlz/microforth This is just one of many Forths; because it's so small, people write their own.
>>12986 Thanks Anon I'll look into it.
>>13025
To be fair, I need to point out some disadvantages. The way it's constructed, you have to think about what you are doing. Modern languages have a ton of programs written already, so a lot of the work is just stringing together already-written code; in Forth you will write a lot of it yourself. Since it's a little different, it might be harder to read and figure out what's going on. My perception is that Forth is kind of like LISP, not that it's in any way LISP, but since it's a little different and takes some thought to get things going, people have a harder time with it. Meaning that it doesn't grow. It's the same as comparing C++, C, Python and then HTML: you have fewer people able to make good use of each in turn, because the level of thought required is a little higher in each case, with HTML being much easier to grasp than C++. To end on a high note: building waifus is not likely to have a lot of ready-to-burn code, so Forth is a fast way to prototype and build working devices. The other easy-to-use languages like Python and Java will be too slow, and C is too cryptic and error-prone.
>>13027
"Meaning that it doesn't grow." Strike that out. I have no idea what I was even trying to say there, but that's not right. It happens sometimes: you start a sentence thinking about one thing, stop, then when you continue you don't tie it together, then miss correcting it when proofreading.
>>13028 Heh, no worries Anon it's fine. We all do that. :^)
Open file (329.04 KB 753x1060 1633245348813.png)
I've been a C/C++ programmer for 5 years. While the languages are excellent for low-level abstraction work, they come with a number of safety issues that can't be ignored. A language designed to be safe from the ground up, that can do everything C can do and just as fast, is Rust. Rust is not easy to learn, but it has an already-large and growing community with many resources.
Just want to throw this in here as a reminder to myself that I might have to learn Haskell: Swish is a framework, written in the purely functional programming language Haskell, for performing deductions in RDF data using a variety of techniques. Swish is conceived as a toolkit for experimenting with RDF inference, and for implementing stand-alone RDF file processors (usable in similar style to CWM, but with a view to being extensible in declarative style through added Haskell function and data value declarations). It explores Haskell as "a scripting language for the Semantic Web".
Swish is a work-in-progress, and currently incorporates:
-Turtle, Notation3 and NTriples input and output. The N3 support is incomplete (no handling of @forAll).
-RDF graph isomorphism testing and merging.
-Display of differences between RDF graphs.
-Inference operations in forward chaining, backward chaining and proof-checking modes.
-Simple Horn-style rule implementations, extendable through variable binding modifiers and filters.
-Class restriction rule implementation, primarily for datatype inferences.
-RDF formal semantics entailment rule implementation.
-Complete, ready-to-run, command-line and script-driven programs.
Though, development has currently stalled: https://hackage.haskell.org/package/swish
>>13847 Did you compile in --release mode?
Open file (491.00 KB 600x745 terry ex.png)
C and Scheme. C for power and speed, generally minding the machine as low level languages do. Scheme for simplicity, ease of use, and speed of writing the code, as well as convenience while ignoring the machine's details. Not that Scheme is slow or C is hard, but that is where one beats the other. These are well designed languages with standards, high quality implementations, and which are widely supported. They're also the languages used in many good books that could be used for self teaching. Forget jobs, forget what people use, forget what companies use, forget what has the frameworks for webshitting, forget everything else but this question: which are some good programming languages? C and Scheme tower far above all else.
>>13850
>look at this ruffle code
That's a fast, well-optimized, DoS-resistant hash function (1). While it's not the same, you can find similar assembly with a SipHash in C at -O3 (2, hashmap_sip). Ruffle is slow because it's a proof of concept and not feature-complete; it says so in the readme (3). I'm not sure why you have a problem with stack pointers; is it panic unwinding the call stack? You can disable that for embedded programming (4). If you would like to learn panic-free Rust, I found a neat guide that was written for you (5).
(1) https://github.com/tkaitchuck/aHash/blob/master/compare/readme.md#Speed
(2) https://github.com/tidwall/hashmap.c
(3) https://github.com/ruffle-rs/ruffle#project-status
(4) https://docs.rust-embedded.org/embedonomicon/smallest-no-std.html
(5) https://docs.opentitan.org/doc/ug/rust_for_c/
>>13877
I'm another anon; I barely know anything about that stuff. I wonder what you think of other languages compiling to C. I recall some version of Scheme doing that, and also Dylan: http://www.cs.cmu.edu/~gwydion/dylan-release/docs/htdocs/d2c.html I always wondered how good the code of such languages is, and how hard it would be to update their behavior to create better code.
Open file (13.56 MB 640x360 Rusticle.mp4)
>>13883
Why would you compile to C? You wouldn't get any performance benefit, since the problem is the way the code is written. Heavily abstracted languages just don't translate well compared to C; you can't just rip out the abstraction unless you were coding without it in the first place. Just look at c--: it's basically C, yet c-- is almost always slower than C, only because idiomatic c-- is stupidly more abstract, and the OOP mentality promotes an inefficient way to write code. Translating those abstractions like OOP into procedural code (which is the only way a computer really works) is inevitably going to produce inefficiency and bloat.
You can still optimize with C from another language, though. Most languages have ways to call external C functions precisely for optimization; just look up "C call in (language)". People usually write performance-critical code in C, compile it to an object file or shared library, and call it from inside the program. Python scripts do this all the time because of how slow Python is.
You could probably go further with the optimization. I know C and c-- have inline assembly too, so you can optimize on the hardware. I don't know every language, but I assume other languages have this too, or at least a way to link a custom object file during compilation or via a shared library.
>>13887
>you wouldn't get any performance benefit since the problem is the way the code is written
What?!? Benefit compared to what? Do you honestly believe compiling some high-level language code into C doesn't make the resulting code faster? If so, then I'd like to see some sources.
>you can't just rip out the abstraction unless you were coding without it in the first place
You lack imagination, and I'm sure this is wrong.
>>13887 Why not use Java?
>>13887
Do you not know how a compiler works? All compilers just compile to assembly; there's nothing special about the C compiler. It's only better at optimizing because C code is way more explicit and deterministic, with the least abstraction from assembly. You're giving the compiler so much information to work with that there's little guessing left for it. That's all code is: you're instructing the compiler what you want to do, and it figures out the assembly for it and optimizes it. The C compiler just demands more for less. Higher-level languages do the opposite: more abstraction, more implicitness, less code. It's easier to write, but the compiler has more guesswork to fill in the blanks and figure out what you're trying to do.
Compilers for high-level languages are already specialized around optimizing those implicities and the guesswork required. Trying to compile an abstract language to C is just as difficult as trying to compile it to assembly. You're not going to outperform the regular compiler unless you actually rewrite abstract code to be less abstract and less implicit, and the only way to do that is to use a more expressive language.
Open file (1.58 MB 1162x2069 Mads.jpg)
>>13900
You're comparing coding in C to compiling from some language into C. That wasn't the question. Leave me in peace with your nonsense. No one will write everything in C; that's even dumber than writing everything in C++. This might be a sabotage attempt, and if so, it will not succeed. We generally need no flamewars over languages here. This thread should be about languages and their use cases: right tool for the job, and so on. For most of us it's going to be Python, maybe some variant of Lisp, and specialized languages for specific use cases. I know that programs written in C, C++ and other languages can be embedded into high-level code or called by other programs, which is exactly why I won't write in any of them till I need to. Then there might be languages which compile down to C, machine code or bytecode, whatever. Case closed, goodbye.
>>13906
Yeah, it's enough; I wrote that already. I'm not even going to read this. The question in >>13883 is still open, but you're not willing or able to answer it, and I don't need it answered here. You're just picking fights for trolling. This should be moved to the basement later.
Open file (66.20 KB 1200x1200 GraphQL_Logo.svg.png)
GraphQL is a query language; it might be interesting for creating APIs between different internal databases. It seems to be popular with the web crowd, and it's open source. https://youtu.be/8l7TxqWI1XA It lets clients ask for only the specific data they need and get exactly that, which is especially useful if the data comes from different sources.
Open file (18.83 KB 528x163 Grakn-logo.png)
>>13923 The other language I'd like to bring to your attention is Grakn: https://en.m.wikipedia.org/wiki/GRAKN.AI >Grakn is an open-source, distributed knowledge graph database for knowledge-oriented systems. It is an evolution of the relational database for highly interconnected data as it provides a concept-level schema that fully implements the Entity-Relationship (ER) model. However, Grakn’s schema is a type system that implements the principles of knowledge representation and reasoning. This enables Grakn's declarative query language, Graql (Grakn’s reasoning and analytics query language), to provide a more expressive modelling language and the ability to perform deductive reasoning over large amounts of complex data. Effectively, Grakn is a knowledge base for artificial intelligence and cognitive computing systems.
>>10565
>Fast AI, Python, Swift
>>13539
>Rust
>>13826
>Haskell
>>13883
>Scheme, other Lisps, Dylan
>>13923
>GraphQL
>>13926
>Grakn
Ignore the trolling. Stay focused. Use the best tool for each job, or the one you get along with.
>>13958
More like an inflatable hammer: it looks solid and shiny, it looks useful, you start thinking about all the things you could build using that hammer, then you pick it up and realize it's just full of air and utterly useless.
My apologies for writing all the software stuff in the math thread, off topic. I was looking at random comments, came into that thread where someone was discussing something related to software, and got carried away without realizing the actual topic at hand. I've done this before, not knowing about the headings, but I really don't have a good excuse this time. I didn't check the subject header when I responded. Should have. I'll try to check this in the future.
>>20938 No worries m80. At this point in time, I'm convinced it's literally impossible not to do so within this forum format, by any practical means. As mentioned to NoidoDev, we'll investigate creating a 'janny-bot' that can migrate off-topic posts in a relatively painless manner. Until then, I'll just stay angry at Stephen Lynx for not providing such management with his software. :^) I'll migrate our convo here shortly. >=== -minor edit
Edited last time by Chobitsu on 03/02/2023 (Thu) 03:52:03.
>>20629
>I can already see this will be harder than originally thought.
Yes, it will be. That's how it always goes! Thanks, Murphy!! :^)
>I think you will almost have to learn C to do this.
Yes, you will.
>I've been downloading books to do this. I have made a couple of C programs and I really hate this stuff. Programming is something that, to me, is satisfying "after" you finish but a real pain in the ass while you are doing it.
This. BTW you are exactly the kind of target audience for which I'm taking such tedious pains to create a programming class for all of us here on /robowaifu/ using C++ & C (>>19777). Neither language is really intended for beginners, but both are absolutely our best choices today for driving the low-level behavior of robowaifus IRL (for numerous reasons). Hey, if this was easy, then everyone would be doing it, right? :^)
It's kind of a matter of 'just how much do you want this?', Anon. You can simply wait around until someone else does it open-sauce, or you can wait and swallow whatever the Globohomo has in store for you, or you can dig your feet in, work hard, and do it yourself. I'd personally recommend you do it yourself, since it's your best bet in the end. Finishing that C++ classroom here will put you waaay down along that road! (Please pardon me Grommet, I mean you no discourtesy personally; I'm simply using you as the general example-case here. :^)
The technology of the near-term timeframe (some of which we're inventing right here) absolutely will support creating basic robowaifus through clever design work. But it's going to be a long haul up that mountain, Anon. Best to get your climbing gear in order here & now! :^)
>C++ & C
I looked at the link and found Bjarne Stroustrup's "Programming Principles and Practice Using C++", but which edition? I see one and two. Will it tell me where to get a free compiler for Windows? I downloaded some other books:
C Programming: Absolute Beginner's Guide (2016/2014) by Greg Perry and Dean Miller
Learn C Programming: A beginner's guide to learning the most powerful and general-purpose programming language with ease (2022) by Jeff Szuhay
C Programming: A Modern Approach, 2nd Edition by K. N. King
Beginning C: From Novice to Professional by Ivor Horton
All had good reviews, I think in order from best to not as good. I'm looking at the first part of your learning-textbook thread and am already thinking of throwing up blood. I hate programming, but I just do not know a way around it. I've looked at all sorts of programming languages and learned a good deal about them, but in the end I never had a task that really required learning a new language, and as soon as I started I would say: this is a waste of time, and I refuse to go through all this pain for nothing. I took FORTRAN ages ago. I once spent about 5 hours on a program that would not compile because I had one too many spaces in the left margin. The teacher's assistant and I spent hours on this. It left a very bad taste in my mouth.
NoidoDev ##eCt7e4 mentioned Nim. I looked at that a lot. I was impressed that someone used it to make a clone of Rebol, another language (called Ni, then Spry), with very few lines of code and, I assume, metaprogramming. I think Nim will compile to RISC-V. Now, I do not know the mechanics of making this happen, but programming languages with metaprogramming, like LISP, Rebol, the Red programming language and Nim, can do almost superhuman tasks with tiny snippets of code that can evolve and reprogram themselves. You don't have to be a programmer to see how useful that can be. We are using limited memory and need to rapidly change things.
Of course, with that flexibility I'm SURE comes complexity and serious trouble setting up the code in the first place. I do tend to be wordy. The reason is that people complain I don't explain enough if I don't, and of course they complain if I do, so...oh well. What I'm doing lately is thinking out loud about possibilities, hoping others will look at what I found and maybe help decide the most efficient, least painful way to do what we need. I'm not smart enough to judge some of this stuff, but if I point it out, maybe someone will come along who is.
>>20948
>Neither language is really intended for beginners, but both are absolutely our best choices today for driving the low-level behavior of robowaifus
It's very unfortunate, but C is everywhere, on anything you can buy. Have you ever read that article on the New Jersey style of programming vs the MIT style? I bet you have. C is just good enough to patch something together. It's like VHS vs Betamax: the cheap knock-off wins.
>>20949
>I looked at the link and found Bjarne Stroustrup "Programming Principles and Practice Using C++", but what edition? I see one and two.
2nd Edition. It's usually just called "PPP2" on the Internet.
>Will it tell me where to get a free compiler for windows?
We'll be using the online C++ compiler Coliru for the first few chapters' homework checkins, by default. https://coliru.stacked-crooked.com/
There are many reasons to use C++ for our robowaifus' systems engineering today. You'll probably understand why better as we go along with the class. I guess for now I'd just ask you to 'trust me on this, bro!' heh. :^) One nice comment by Anon was about our realtime processing performance needs, and why Python will not work (at all) as a systems language (>>20852). There are several other possibilities for a systems language, but C++ is our best choice overall among every one of these.
Haha, please don't worry about what any of us thinks, Grommet; 'complain' away! It really helps all of us get on the same page with fewer opportunities for misunderstanding. When you're dealing with highly-complex systems (and we are here), it can become vital to be extremely clear on things. Nice posts Anon. Cheers :^)
>>20870
>New Jersey style programming vs MIT style
As a mild rebuttal to that basic premise, I offer you file-related. >
The exact reason that "the cheap knock-off wins" is exactly that: it's cheap. Cheap on hardware resource requirements, cheap on runtime performance requirements.
>tl;dr
There is, roughly speaking, a near-linear inverse correlation between ease of apprehension and efficiency of systems (particularly as concerns wall-clock perf). ASM is by far the closest language to the machine itself, and therefore highly efficient (in both time & space). Arguably C is the next rung up the ladder, and has been called a 'portable assembler'. But it offers no real mechanism for abstraction, which is an absolutely essential feature for devising systems as big as the ones needed in real robowaifus. The next rung up is C++, which does bring abstraction to the table: a kind of 'object-oriented assembly language'. This, while it may make you want to puke, is the primary rung we're standing on for this project (with some work on the two rungs just below). I hope you brought a few barf-bags, Anon, cause it's gonna be a bumpy ride. :^)
---
>Addendum
We will be using Python (or possibly Lua as an alternative) to enable Anon at home to piece together scripts for his robowaifus to use, similar to the way Python is broadly used in AI today to script backend runtime engines (often) written in C++. So don't worry if you have to fall out of the C++ learning class, Anon. Please bring your other skills to bear on the hardware side of the table for us all here!
BTW, this big programming chore is only hard for us here because we're the trailblazers, the frontiersmen. Someone has to do it, after all. However, much like the advance of civilization, the ones who come along afterwards will have a much easier time of things. You can just lay back and be comfy in that group after a few years, if you choose to instead, Anon. :^)
>=== -add addendum
Program the ESP32 using Nim! https://github.com/elcritch/nesper
I wonder if something like this might suit me better? Not that I know. There's no doubt C++ is the 1,000-pound megabeast that does everything at lightning speed; I do not question its utility. I worry because I always hear that it has so much to it that it's impossible to learn, that even people with years of experience never learn it all. A labyrinth of mirrors. I was looking at the above link, and even its complexity was giving me a headache. It seems that C and C++ are one big mutating glob, where you can never count on the include files or headers being the same anywhere. I might be better off just hacking the ESP32 and learning only the stuff I need as I go. I think there's just too much.
I've been looking at Nim videos, and the more I learn about Nim, the better it sounds. For those interested: a fast programming language that is set up like Python to be easy to understand, compiles to C, has macros for abstraction, is super fast, and also has libraries for the ESP32 and other microcontrollers. Nim sounds SO GOOD. Here's a good video from someone who made a library linking Nim to Arduino programming. https://www.youtube.com/watch?v=0dt3icPj8Kk
The problem with C++ is it's just too damn much to learn, and to me it appears to be a big, disjointed thing. Chobitsu, don't take this the wrong way, but could you be falling into the "I'm a real man" trap? Like someone who needs a tree cut down and, instead of using a chainsaw, uses an axe because "he's a real man". What if, using Nim, you could be 50% more productive? The number could be higher. And what if, instead of churning away at C++, you could add to existing libraries like the one in the video? Though it's not a real test: I've heard all sorts of damning of C++, but I've never seen anyone damn Nim. Some have said it was not for them, but I haven't seen any real damnation like you can find for C++. I read the article on why worse is better, and I agree that it is, but only for a limited time; because if the bad hangs around, then in order to massage all its faults, people pile more bad upon the bad until you have a huge ball of bad that ties you into a knot. PHP, I'm told, is a good example of this sort of thing. And just to throw further fuel on the fire, to see if it will flare up really high: https://sites.radford.edu/~ibarland/Manifestoes/whyC++isBad.shtml
In the math thread I wrote about Nim. It seems ideal. The creator has specifically designed it to work with 8-bit processors on up, so it's set up to be used with microcontrollers, and people have built libraries to ease the transition. Microcontrollers will undoubtedly be used in large numbers in waifus, because nothing else can so cheaply control a lot of actuators AND has built-in sensor pins. MCs are explicitly designed for these types of tasks. It's cheaper, and less work, to string a mass of these together than to build one-off boards to control actuators and read sensors. Most of these run about $9 or so in quantity, maybe more with more options. Rough calculations show that about 20 of them would be enough to control all the actuators for a robowaifu.
The goals of Nim are so aligned with the hardware of robowaifus, it's as if he were making it for us. He noticed a gap: Python is really easy to learn compared to C++, but slow. He also noticed that the really productive stuff, like LISP and its variants Scheme and Racket, was very productive but didn't necessarily have compilers for many different processors. So he created a Python-oriented syntax for ease of learning that compiles to C for all processors, or to JavaScript for browsers and the vast JavaScript infrastructure. On top of this he added metaprogramming as in LISP. It seems ideal.
Showing how Nim can be used for microcontrollers, there's a set of videos; one I linked before: next-generation microcontroller programming, "Zero-cost abstractions for better embedded programming". https://www.youtube.com/watch?v=0dt3icPj8Kk
The guy above, Peter Munch-Ellingsen, has a large number of microcontroller videos, where he wrote his own code for his keyboard and used his own microcontrollers to do so with Nim. Now, a waifu is not a keyboard, but it might show some of the actions needed.
Look at the video above and see how super small the Nim examples compile compared to several other microcontroller languages he used. It's a big difference, and we need every byte. There's also the idea that, roughly, the smaller the code, the faster it runs, because it keeps the code in the MC's fast internal memory where it belongs, not losing speed interfacing with slower memory.
I perceive (and may be wrong) that one reason software never gets faster no matter how fast our processors become is that people pile on massive, kitchen-sink frameworks. This stuff will not fit in the processor's cache memory, so it's always going out to main memory, dragging it down to bus speeds: way slower. However, if you use metaprogramming techniques, the software can all fit in cache, and even if it takes more instructions, they all stay in the cache, operating far faster. Nim, I believe, would allow us to do this. In the video above it says "Zero-cost abstractions for better embedded programming", so I think he is doing exactly that. There are videos on using metaprogramming like LISP for Nim.
A good trick for finding and rapidly downloading these videos is to go to this address: https://yt5s.com/en163 You put in the address of a YouTube video and enter it. You can then download it through them by some magic I don't understand. What I do know is that it seems to be at least ten times faster. Another tip: you can search in the above search box. It doesn't give all the links, nor many of them, but once you find a subject header or the name of a YouTube uploader, you can use that subject or name to search and find more videos. A couple of good searches:
Nim Programming Language
Peter Munch-Ellingsen
PMunch

Another interesting video talking about metaprogramming on micro-controllers with Nim: "Optimising for Humans - Nim meta-programming for 0-cost abstraction on microcontrollers" https://www.youtube.com/watch?v=j0fUqdYC71k

Summation: Chobitsu has said, rightly, that abstraction, which I read as related to metaprogramming like LISP, is hugely important to writing good code. I'm not a software engineer, but I do get the idea in a rough way and see that it's good. What I rebel against is using C++ when we have other tools, like Nim, that can do just the same. And even if it does not have as many libraries, Nim has several tools that are supposed to wrap, or, by another process I do not understand, rewrite the headers (I think), so that C or C++ libraries are incorporated into your Nim program. Another good video: "Nim Metaprogramming in the Real World" https://www.youtube.com/watch?v=yu6SC2vd3qM
Some interesting libraries for Nim:

futhark - Automatic wrapping of C headers in Nim. "...Futhark aims to allow you to simply import C header files directly into Nim, and allow you to use them like you would from C without any manual intervention..." https://github.com/PMunch/futhark The same guy has several good libraries: https://github.com/PMunch

Ratel - Next-generation, zero-cost abstraction micro-controller programming in Nim (metaprogramming???) https://github.com/PMunch/ratel

Nesper - Program the ESP32 using Nim! https://github.com/elcritch/nesper Note in the above library's readme: "...Nim now has support for FreeRTOS & LwIP..." So Nim has support for FreeRTOS, one of the major real-time OSes needed for micro-controllers. I believe micro-ROS can be used on top of FreeRTOS, so there's a large amount of code already written for just what we need. https://micro.ros.org/ I suspect, but do not know, that micro-ROS will work with the larger Robot Operating System (ROS), https://www.ros.org/ which even has simulation software and can be used with larger, faster processors. So higher-level stuff can be done by the more powerful processor while micro-ROS runs the micro-controllers. While I'm sure all this is a huge mass of complexity, I'm fairly sure it will be faster and easier than writing the whole conglomeration ourselves.

Binding to C Libraries with Nim https://gist.github.com/zacharycarter/846869eb3423e20af04dea226b65c18f

Nim Wrapping C http://goran.krampe.se/2014/10/16/nim-wrapping-c/

c2nim - a tool to translate ANSI C code to Nim https://github.com/nim-lang/c2nim

I think, though I'm far from sure, that while Nim may have some rough edges in terms of library support for our specific case, if some of the better C and C++ programmers would tidy up the libraries already made for Nim, that time spent would accelerate the programming for everyone and raise all of our productivity toward the goal.
>>20956 >>20958 >>20959 Great research work, Grommet. This post is just a quick note to let you know I've seen your posts. Now that I've moved the conversation, I'll respond to you more properly before too long.
The processor mentioned in the video I linked, "Optimising for Humans - Nim meta-programming for 0-cost abstraction on microcontrollers", is the ESP8266, the older predecessor of the ESP32 I'm always talking about, which is much more powerful. The older ESP8266 could probably still be used for actuators and save a great deal of money. They are super cheap: you can get six of them for $19, so roughly $3 a piece.
Above video link: "Nim FASTER than C++!"
OK, Grommet. I'm going to delay a fuller discussion of compiling Nim for our robowaifu's microcontrollers until this weekend (hopefully). However, you're still going to need to know C if you want to do this at a pro-level -- even using Nim. There is only one book to recommend: K&R 2e > You'll be learning the C89 standard of the language. https://en.wikipedia.org/wiki/ANSI_C#C89
Arguing about programming languages misses the point. The hard part is coming up with the algorithms and ML systems needed to run the robowaifu, not writing the code itself. Thus the choice of programming language is largely irrelevant.
Youtube video: LISP - Lex Fridman's favorite programming language. @2:10: "...Development of programs on LISP proceeds at, I think, at somewhere between a thousand and fifty thousand times faster..." LISP of any sort will not be on our micro-controllers, but Nim will be, and it can be used in a similar way.
>>20982
>the choice of programming language is largely irrelevant
This is only true for already-successful programmers who have a good feel for the subject. For those who do not, and who have no inclination to be programmers, the choice of tools is important. How many years does it take to become a proficient programmer? Lots.
More like: which programming language would you like, for the waifu to suit you best? What would suit the robowaifu best is Python.
First off Grommet, let me thank you again for all your hard work gathering information together on Nim for us here. Great job, and you've made a good case for your position. But rather than leave you in suspense let me just cut right to the chase: There is no way that I'm not using C++ as the primary systems-programming language for this project. There, I said it. :^)

Simply put, C++ by far represents our current best-bet for overall success devising real-world robowaifus. Hands-down. I wouldn't seriously think for a moment about walking away from that abundant blessing that has been dropped right into our collective laps. C & C++ are two of the most 'banged-on' programming languages out there and together represent the vast majority of systems-oriented projects in the world. There is also a big heap of application solutions using them too. The ISO C++ Committee once did a short-list roundup during a Language Evolution Working Group meeting a few years back, and just in that room alone (with ~10 men present) the lines of code they collectively represented as leaders in their respective institutions came to over 5 billion lines of working, production C++ code out in the industry. Let that sink in for a minute. And it's bigger today. Also, Bjarne Stroustrup compiled a (no longer maintained) list of notable C++ projects out there. You may recognize a few of them (and I also have a good number -- for example, all BMW cars -- which are not on his listing): https://stroustrup.com/applications.html

While no language is perfect, C++ brings more power and flexibility to the table, with an amazing array of libraries available to pull from, with more maturity and field-proven capability, than any other programming language. C is also a great language in these regards, but we absolutely need good support for abstractions in the vast majority of the code we're going to devise while developing our robowaifu's systems.
Also, there is an abundance of C++ developers around the world, and some are doing quite remarkable work. That is a great potential talent pool for /robowaifu/ to draw from. While not an unavoidably-critical item, it's certainly an important one to consider. I don't mean to be a dick about it Anon, but quite frankly you're basically wasting your time in trying to convince me to use any other language for this project's systems code work. With humility, I claim that there is literally no better option available to us in existence, at this point in time, for that arena. --- OTOH, since the author of Nim very wisely devised a transpiler-based approach for his language, and since it has both C & C++ code-generation facilities, then I think we can with good conscience use it as a sort of 'scripting language' to create code files that can afterwards be compiled in the traditional manner with traditional toolchains into binary executables for our robowaifu's microcontrollers, sensors, and other smol devices onboard. Thus I'd approve of (and even recommend) you pursuing learning Nim on your own for just this purpose. However, you're still going to need to be able to write C to implement this process effectively & professionally. (>>20980) C++ (and C) syntax may make your eyes want to bleed, but it is what it is. Syntax is something you get used to (once you begin applying yourself to learning it). It's the algorithms that are the difficult part. Great libraries can go a long way, but our devs here are all going to have to be very clever, regardless. Please always keep in mind that everything we are doing here -- and especially in their sum-total combination -- is groundbreaking. We're striving for something that literally is on the edge of what's even possible ATM. Almost everything here is going to prove difficult until we have in fact finally succeeded at doing it. You just have to resign yourself to that situation going in. There really are no shortcuts here Grommet. 
As this Anon stated: >"The hard part is coming up with the algorithms and ML systems needed to run the robowaifu, not writing the code itself. Thus the choice of programming language is largely irrelevant." (>>20982) While he's a bit off-target (since efficiency in both time & space is critical for our specific domain), he's fundamentally correct overall. Thanks for reading my blogpost, Anon. Godspeed to us all! Cheers. :^) >=== -prose edit
Edited last time by Chobitsu on 03/05/2023 (Sun) 18:37:39.
It's exhausting to see all these bizarre literally-who languages anons have suggested. A stack consisting of C / C++ / Python is reliable, mature, has an unfathomable wealth of material to draw from, and covers every conceivable use case from metal to meta. Language-of-the-month debutantes need not apply. Learning C is not the herculean task some people are making it out to be. New programmers balking at this and running back to the shallow end is detrimental to the art and industry overall IMHO. That said, the info presented on Nim is really interesting - and I'd agree with Chobitsu that it could be a useful /supplemental/ tool.
>>21072
>It's exhausting to see all these bizarre literally-who languages anons have suggested
HAHAHA I get that. I really do. My perspective is that whatever I learn, I will have to learn from scratch, so using something that is more flexible and easier to use is better. Whatever happened to "worse is better"? Oh, that doesn't count when it's C++...right... :) (not meant to be offensive) In the end I'm just thinking out loud. It helps sometimes to write things down and get feedback. I may have to learn some C, but if it can at all be avoided I will not crack a book on C++. Most well-respected programmers have said it's an abomination, yet they also say they use it, just because it's so widespread and has so many tools. I've been looking at interviews with programmers on Lex Fridman's youtube channel where he asks them what's the best language. They say Scheme, which any and all good programmers say you should learn because it will teach you good programming, specifically from "Structure and Interpretation of Computer Programs" -- but not for actual work, because it's so malleable that it can be hard to understand after you write it. They do say C and C++ for production, but not because they like them; it's because they are working on large teams and have to have something everyone understands and can use. I think Nim might be good enough, and any libraries you need seem to be able to be wrapped or imported into it.
>Learning C is not the herculean task some people are making it out to be
C++ is a different story though. I've actually written two programs in C, a long time ago, illustrating Fourier transforms by making square waves from sine waves and displaying them. It was no worse than FORTRAN.
>>21076
If your position is to use C / [Nim] / Python, then I think we may have some common ground at least.
>Most anyone who is a well respected programmer has said it's an abomination, yet they also say they use it, just because it's so widespread and has so many tools
This really gets to the heart of it. Perceived 'ugliness' will *always* be secondary to momentum and availability. The Waifu project is already starting from scratch (or nearly so) in so many areas. There is so much invention/innovation that has to be done - *in tangible real-world space* - that delving into novel languages will only compound the work required. But if you have to deviate, then doing so in a way that can transpile to the lingua franca is alright I suppose.
>>21072 >>21078 Hello Anon, welcome! Thanks for your great insight, and I'm completely onboard with your 3-layer stack. We may find ourselves needing ASM as well possibly, but hopefully not. BTW, you seem like you may have a good bit of experience with C development. If so, then we'd sure appreciate that tired old man perspective within our C++ Learning class (>>19777). Cheers. >>21076 Excellent. Glad to hear you're apparently open to using C, Grommet. Looking forward to seeing how you use Nim alongside it, too. It's all gonna be gud! :^)
>>21072
>A stack consisting of C / C++ / Python
Personally, I don't plan to write anything in C or C++, especially not the latter. For anything more than a wrapper, if it needs to be fast, I'll use either Nim or some kind of Lisp. I have no patience to read through complex code for no reason, or to write it. If other people write programs in C/C++ and these are then constrained inside the system, I might use them. If I can't trust them based on the access they have, and don't want to read through the code, then not.
>Language-of-the-month debutantes need not apply
Nim and Lisp have existed for a long time.
>>21088 >If I can't trust them based on the access they have and don't want to read through the code, then not. Heh, then you'd better get your thinking-cap on Anon. :^) Because systems-level code has access to practically everything in the whole system, roughly by definition. That's what it's there for; it's the 'glue' that binds everything else together, the foundation upon which everything else stands. You can think of it vaguely like the Operating System inside your robowaifu. And it's gonna be fairly YUGE (and remarkably complex) in the end -- which is exactly why we need/use C++'s abstraction mechanisms for writing it with; to greatly-simplify the problemspace developers have to comprehend at any particular nexus into the overall robowaifu software ecosystem, which in turn can significantly help them to do their work well. And for the general-use case, this massive system also has to run lightning-fast too. Probably about half of the overall systems code has to run as soft-realtime, with maybe 10-20% of it being in the hard-realtime category. And C++ affords all four types (abstract, general, soft-realtime, hard-realtime) of these needs for our robowaifu's systems core (with ASM / C making up any remainder). This central system is what the external world in general, and the Python scripting system in particular, will communicate directly (and only, for safety & security) with. And all the important lower-level systems across the robowaifu's onboard compute & signals internals -- things like microcontrollers & sensors -- will also communicate back up to it as well (these lower system's codes will be primarily written in C). >tl;dr Without robowaifu systems code, nothing works together in a coordinated way at all. >=== -prose edit
Edited last time by Chobitsu on 08/05/2023 (Sat) 08:40:48.
>>21088 Participate, piggyback, or pipe-dream. Sounds like you are opting for #2 or #3. I'm through with debating; do what you will.
>>21091
>systems-level code
That's part of the problem in this thread: it's not about which language for what, and whether something is even necessary, but all kinds of things mixed together. I don't know about your Robowaifu Foundations code, or why there would be one set of programs able to access everything that needs to be written in C++. Anyway, this is a project of its own. That doesn't mean everyone will use it, or all of it.
>That's what it's there for; it's the 'glue' that binds everything else together,
This is exactly where slower languages are often being used, as far as I'm informed. You do the heavy lifting in code close to the machine, but other things in a high-level language.
>vaguely like the Operating System inside your robowaifu
I will consider that if I'm ever convinced about it.
>>21103
I have no idea what you're talking about. I've been participating on this board for years; I'm just not joining this idea of a board project. If this were about a specific project or program, that would be one thing, but this is a general thread about programming for robowaifus. So maybe discuss something you have agreed upon in a dedicated thread, where it is already clear which language is being used.
>I'm through with debating
Debating what exactly, and when? This thread turns tense every time, there's only some preaching and ranting at the end, and we part ways for a while. You do what you do anyways; you don't need to announce that.
I am aware that the final product usually is written in C (not C++), but we're prototyping, and that is usually done in Python. The final product can be written in Python; it doesn't matter that much.
>>21128 >That's part of the problem in this thread here. It's not about which language for what and if something is even necessary, but all kind of mixed together. Actually, the broad topic of software of all kinds is exactly on-topic AFAICT Anon. >"In this thread, we will discuss these questions and more so those of us who aren't already settled into a language can find our way." (>>128) As to your general point regarding high-performance code (or the lack thereof), the real world is going to have a way of schooling all of us in this regard I'm quite certain. My perspective on it is that our designs are going to need several extremely-tight kernel loops, running continuously on a number of different internal boards, handling a wide array of inputs & outputs, all at the same time, without stutters or pauses... to even make effective robowaifus possible IRL. This highly stringent requirement will need a high-level, abstract language that's reasonably close to the metal to solve it (and one that doesn't also carry a yuge amount of baggage such as R*st along as well). C++ is the clear winner here (and also why I dedicated myself years ago to learning the language). But by all means; certainly you (and every other anon here) please prototype a robowaifu system using Python or w/e programming language you'd care to. Maybe you'll teach us all a thing or two that way Anon, who knows? :^) >=== -patch hotlink -prose edit
Edited last time by Chobitsu on 03/06/2023 (Mon) 20:06:14.
>>21142
I plan to use several internal computers, which would do their work locally and then only send some signals or short strings to other computers. I want to use whatever language seems right for a certain job and that I'm willing to dig into. If I use someone else's code I need to be able to trust it, and I will constrain it as much as I can. I don't plan to look through big piles of C/C++ code on my own, and then do this again with every upgrade. I will certainly not give something like that full control of everything. So better make it mature: no upgrades necessary for the next 30 years after certification.
>Actually, the broad topic of software of all kinds is exactly on-topic Anon
Arduino uses C++ anyways, for example. But then we're just using the libraries or short snippets and changing some values. Even many hobby programmers know that the speed of code makes no difference if it's only executed a few times, or occasionally. My point is, if some C++ anons want to make everything about their favorite language, they shouldn't annoy others with it. If there's a project to make everything into a C++ Robowaifu OS, then work on it, and don't come over here with dismissive "real man" comments like "tired old man", "Language-of-the-month debutantes" and "bizarre literally-who languages". I think I will avoid discussions about programming languages here in the future, maybe even hide this thread, and just look in other places for what they recommend for a certain job. Basically all of these >>13955 plus Nim, and maybe things like Prolog, are potential candidates.
>>21146 >So better make it mature, no upgrades necessary for the next 30 years after certification. Lol. >My point is, if some C++ anons want to make everything about their favorite language then they shouldn't annoy others with it. Also lol. AFAICT I'm the primary """annoyer""" here on /robowaifu/. :^) I already urged you to create a system using Python or any other language Anon, and I meant it. I honestly look forward to seeing what you manage to come up with. You already know the course I'm set upon. Godspeed to us all. Cheers.
>>21148
>Chobitsu
Not being judgmental, but I don't understand why you keep pushing C++. My understanding from major, very top experienced programmers is that it's somewhat of an abomination. They only use it because of its libraries and the resources behind it. Have you even looked at Nim? You keep mentioning Python, for good reason, but Nim was made to copy the programming ease of Python, use the infrastructure of C, C++ and JavaScript, and the speed of C, while adding the metaprogramming of Scheme in an easier-to-use, more structured (meaning easier to understand later) form. It's well known from most everyone that metaprogramming can give you huge productivity gains. It would seem that Nim, with maybe a little Scheme thrown in, would move things forward much faster. Nim seems ideal for all uses and cases. I can't see learning C++. Too much detail. No one knows all of it, even people who have been after it for years. Something like Scheme may be really fast for coding, but the code becomes so specialized that each section of it, I think, becomes its own language, so no one else can understand it.
>>21200 Lol. Hey Grommet didn't we already have this discussion before? :^) >Have you even looked at Nim? I did look into it sufficiently to confirm that yes, it was a transpiler-based design and yes, it can output both C & C++ code files. That's the extent of my research on it ATM, since I have a lot on my plate r/n. But I thought you yourself were going to investigate it for us all Anon? Regardless, as with my advice to NoidoDev, I'm all for you pursuing developing robowaifu systems in Nim or any other language you care to. Want to teach us about it? I'll be one of your best students I imagine. Godspeed. >Not being judgmental but I don't understand why you keep pushing C++. Thanks, and I don't take it that way at all from you. So, if I haven't made it all clear enough yet, then I hope it will become at least somewhat more clear to you during the course of our /robowaifu/'s C++ Learning Classroom (>>19777). While Bjarne Stroustrup's plan of discussion for his lectures is on a broad and diverse set of industry topics related to programming itself, mine are rather more narrow. Namely, robowaifus and all the pertinent systems thereto -- both within and without their physical manifestations. I think this should make the learning both more fun, and also more interesting b/c it's clearly pertinent to what we're all striving for here. In some ways, I think we have the literal best-angle for a computer programming class of anything out there. We're building robowaifus after all! :^)
>>21146 > just look in other places what they recommend for a certain job And when you get to the part about your robot needing to operate in hard-realtime, you will give c/c++ another look. Especially since your plan (as of now) hinges on passing messages between loci. >don't come over here with dismissive "real man" comments You're being just as dismissive of C by mocking those posts as 'real man' poseurs. Get real. >>21088 >Nim and Lisp existed for a long time. And have always played second fiddle to C. You strike me as a purist, and that's fine. But if you absolutely refuse to get your hands dirty you'll never get anywhere in designing control systems. Good luck, anon.
>>21203
>didn't we already have this discussion before?
Apologies. No intent to torture you. I was wondering, in the vein of "what am I missing", why Nim, seeming so much easier, is not seen as a good option while C++ is. I won't ask any more. I may talk about Nim if I find something I like, but otherwise I'll leave it alone.
>>21214 Point taken. You know sometimes when you write things it comes off as aggressive when...that was not the case or intent. I'm talking about me not you. So maybe my wondering about things, are sometimes taken out of context and not really what people think they are. Generally if I'm being aggressive or contemptuous you will know it. I'll be very blunt.
>>21232
>Apologies. No intent to torture you
Haha, no worries mate. I actually enjoy discussing this stuff, because it helps our newcomers realize an important part of what we're all up against here (hard-realtime), and not to just naively plod on in a fruitless direction and then leave disillusioned and discouraged. It's happened here before BTW. :/
>[Nim] is not seen as a good option while C++ is.
I've tried my best to make it clear that, together with the very eminent programming language C, C++ is our best option IMO. I would be pursuing something else for our primary language choice here if I believed anything else, trust me. Rust, Go, Ada, D: all of these are reasonably good as systems language choices, but not as good as C++ is today. (Ada is disqualified primarily b/c of its exceptionally verbose syntax, along with the lack in abundance of qualified developers. Otherwise, it would likely be our best choice here on /robowaifu/, I think.) Heck, even 'alternative C++' is a good choice (and getting better quickly). What I mean here is that reduced variations on the ISO standard language, like Circle [1] and Cppfront [2], are being investigated as viable language-evolution pathways. But the C++ Core Guidelines [3] still far-and-away lead the pack for this C++ language research area. BTW, we'll be discussing the proper use of the CPPCG recommendations during the latter portion of the C++ programming classroom -- after our beginners have gotten up on their feet with the language's syntax.
>I may talk about Nim if I find something I like
Excellent! Please do so Grommet. And the same to every anon here... don't let my goals for us sway you in the least from pursuing your own interests in these affairs. That's how we all learn and teach each other together! Cheers. :^)
---
1. https://github.com/seanbaxter/circle
2. https://github.com/hsutter/cppfront
3. https://isocpp.github.io/CppCoreGuidelines/CppCoreGuidelines
>===
-prose edit
-add C++ evolution links
Edited last time by Chobitsu on 03/10/2023 (Fri) 06:37:02.
I've avoided talking about this because this language, though it's probably one of the most exciting things ever, is incomplete. If you are a programmer you owe it to yourself to at least look at the first link to get a taste of the power of this language. The reason I'm talking about it is because it has so much promise, and it may be of instant, right-now use to some who need quickly-written programs for daily use. It's called REBOL.

REBOL was written by Carl Sassenrath, the guy who wrote a lot of the first PC multitasking operating system, for the Amiga computer. Rebol is scripted, so it's not super fast, but it's not slow compared to other scripting languages [note: others are building compiled versions]. He did this about the time open source became all the rage, and he charged a stiff rate for it. Back in the day, companies like Borland made big profits selling programming languages, but open source wiped out most of those profits. So Carl did the right thing, at the wrong time. It didn't take off. No one was paying for languages anymore. Many people that have used it consider it the best, most productive language ever. It's probably the most forward-looking programming language ever.

Ok, so after it languished for a while with a few fanatical users, he open sourced it (called R3). While it was closed source, another very talented user of Rebol decided to write an open-sourced Rebol with a lot of advances himself, and make the whole thing BSD licensed, PLUS it will have a compiled systems language like C and C++. It's called the Red Programming Language (RPL, or RED). It's not done yet, but it's fairly far along.

Now you're thinking: blah, blah, another wacky language. NO. It's very deep. It is like a sort of human language: words can mean different things in context. It is sort of like LISP, and sort of like Forth and Logo, and sort of like a lot of other languages.
A big tell of its productivity is its small size, because it takes far fewer lines of programming to express programs. The whole interpreter, with a built-in GUI, is less than a megabyte. It also has Domain Specific Language (DSL) support built in, so you can build your own DSL. Some DSLs are HTML, Postscript, or FLASH; in the case of Rebol, the user GUI is actually a DSL called VID. It also includes vector graphics, in less than a MB. Networking and databases are included in this tiny base interpreter. And much more.

There are several versions, including the original Rebol (now called R2), RED (RPL), R3 (Rebol 3), and Meta, which are all mostly the same. They all use the same sort of syntax and ideas to function. You can learn Rebol and use the same syntax with R3, Meta and RED. Rebol and R3 are used in production systems now, so these are immediately useful, but are not really growing.

Now the R3 branch has also split off branches. There's one base branch; then there's a branch maintained by a company that does automated supply-chain work (they automate factories). There's a guy nicknamed "Hostile Fork" who has taken R3 apart and really optimized all the functions. If that isn't enough, there's a guy, Kaj de Vos, who I really think will do some good, who has a Rebol that he's rewritten such that it's compiled and very fast. He calls it Meta.

Now this may not interest most people, but if programming languages interest you, OR you know a tiny bit of programming and want to make GUI programs extremely simply, then this will interest you a lot. Here's a super quick overview by one Rebol user who used it in his business to write all sorts of programs. He has other tutorials also.
The Easiest Programming Language: REBOL, by Nick Antonaccio http://easiestprogramminglanguage.com/easiest_programming_language.html

Here's Nick's forum, and a link to Rebol resources: http://www.rebolforum.com/index.cgi?f=home

Important links (a few of these may be gone and you will have to use the Internet Archive to see them): http://www.rebolforum.com/index.cgi?f=printtopic&topicnumber=47&archiveflag=new

You can get the original REBOL here. It works fine right now. You will be stupefied and amazed at all the functionality this packs into a super small package. I mean this, you will be amazed. Go to the get-it page to download. It has GUI and non-GUI versions for most OSs. http://www.rebol.com/

Kaj de Vos' Meta site (read this, it's interesting, and has a good overview of why REBOL is good). He wrote a lot of, maybe most of (not sure), the Syllable operating system, so he knows what he is doing. https://language.metaproject.frl/#get

This is the Red Programming Language. It's fairly far along; you can use it now. https://www.red-lang.org/

Here's REBOL version 3, mostly referred to as R3. Work is being done on this, but mostly elsewhere. http://www.rebol.net/

Here's the forum for Hostile Fork's version of REBOL, "Ren-C". Part of the problem with his version, for me, is that he is so deep into it I can't understand what he's doing. He's doing some super deep thinking about the guts and semantics of the system, and it's too much for me. I can't follow. Maybe people smarter than me (not difficult to find) can make better use of this. https://forum.rebol.info/categories Github: https://github.com/metaeducation/ren-c

That's a good start. Now you might ask: why deal with this crazy thing? Look at all the people that have taken up REBOL and tried to modernize it after Carl abandoned it. Carl put over a decade into this, and I don't think he made any money. He's growing grapes now, like a good former Silicon Valley executive, and I think he's just done with it.
But look at all the people that saw this huge potential. People have spent years working on making REBOL into what it could have become with support. This says something about REBOL. It's special and people are willing to put their entire lives on hold to expand it. This sort of dedication to an idea is hard to come by so it might be something you could also be interested in.
>>22017 What a big screw up. I meant for the very first link to be the internet archive version. Sorry. https://web.archive.org/web/20210123002145/http://easiestprogramminglanguage.com/easiest_programming_language.html
I can't help but think that if this could be set up such that we can easily use it, BUT with newer compiled versions for speed, then it would really correspond to the basic ideas needed for AI and for waifus. My favoring of Nim is really because it's the closest thing to REBOL that compiles easily; REBOL would be much better. Note this statement by Carl on the idea of Rebol: "...In REBOL, most expressions are created as a series of functions that are evaluated mathematically and produce a flow of results..." http://www.rebol.com/article/0188.html A pile of Rebol learning links: REBOL in Ten Steps presents the basic ideas of REBOL in ten steps that should require only a few minutes to read. http://rebol.com/rebolsteps.html Nick Antonaccio says it much better than I. From https://web.archive.org/web/20201029055852/http://easiestprogramminglanguage.com/easiest_programming_language.html “…REBOL is the simplest solution for creating cost saving custom applications which improve and enable business processes of all types. If you’re an IT professional or the local tech guru searching for a powerful scripting tool to manage and interact with a wide range of common computing technology, including web site scripting and distributed network application creation, REBOL is the easiest tool to provide that power. If you’re looking for a fun language to teach children, which will do more than just demonstrate fundamental concepts, there is nothing more immediately understandable and directly usable than REBOL. REBOL requires far less wading through complex, foreign, and confusing initial learning “cruft”, than any other language, and it enables even children to actually accomplish real programming goals, without the disappointing or boring technical limits found in “teaching” tools. REBOL includes amazingly simple to use GUI, graphic, network, email, math, parse, and other capabilities built in – no external libraries required. It’s all included in the 1/2 meg download.
And REBOL’s learning curve is fast. There is only one “series” format to learn, and the exact same functions apply to managing tables of data, text strings, network ports, emails, graphic display lists, files, etc. REBOL not only replaces other commercial desktop and mobile development languages such as Visual Basic, C, and Java, it also allows you to do all the same things as web server scripting languages such as PHP, Perl, Python, and Ruby. But that’s not all – it also functions as a graphic and multimedia platform such as Flash, and replaces DBMSs such as Access, SQLite, and MySQL, a variety of system utility applications, and more, all with one simple paradigm…” I know all this sounds ridiculously amazing but it is true..." Learn REBOL http://re-bol.com/rebol.html http://www.drdobbs.com/embedded-systems/the-rebol-scripting-language/184404172 Learn Rebol: Writing More Useful Programs with Amazingly Small and Easy-To-Understand Code http://magazine.odroid.com/wp-content/uploads/ODROID-Magazine-201403.pdf#page=22 Learn Rebol: Writing More Useful Programs with Amazingly Small and Easy-To-Understand Code – Part 2 http://magazine.odroid.com/wp-content/uploads/ODROID-Magazine-201404.pdf#page=22 Programming Your ODROID-SHOW: Using the Rebol Programming Language to Improve the Hardware Interface http://magazine.odroid.com/wp-content/uploads/ODROID-Magazine-201406.pdf#page=6 Programming with Rebol: Reducing Complexity in Development http://magazine.odroid.com/wp-content/uploads/ODROID-Magazine-201402.pdf#page=29 Learn Red: The Next Evolution of Rebol – Part 1 http://magazine.odroid.com/wp-content/uploads/ODROID-Magazine-201403.pdf#page=25 http://www.re-bol.com/rebol.html http://www.rebol.com/tutorials.html http://rebol.org/ https://web.archive.org/web/20190222084130/http://www.codeconscious.com/rebol/ http://www.rebol.com/docs/core23/rebolcore.html http://rebol.net/ Few links on the "idea" of Rebol http://www.rebol.com/article/0188.html 
http://www.rebol.com/article/0103.html http://blog.hostilefork.com/arity-of-rebol-red-functions/ http://blog.hostilefork.com/why-rebol-red-parse-cool/ A Rebol vs. Python Performance Test http://blog.hostilefork.com/rebol-vs-python-performance/
>>22017 >>22028 >>22033 Nice posts, nice information Anon. Thanks! :^)
>>21057 Unlike with C++, Rust's tutorials and the compiler itself by default steer you towards the most memory-safe way to program. For example, do you really want to deal with the endless malloc/free vs. new/delete vs. smart-pointer debates, and with the inevitable memory leaks that plague every single C++ project because some people refuse to use the latter, when Rust has this settled? When even Linus Torvalds is allowing Rust into the Linux kernel, and Microsoft is now rewriting parts of the Win32 implementation (not the public API headers) in Rust, you can't continue to dismiss it as a "literally-who language."
>>22285 >malloc/free || new/delete vs. smart pointer debates and deal with the inevitable memory leaks that every single C++ project is plagued with Lol no. Simply don't leak is the proper answer, and it's not at all difficult to achieve when writing purely in C++ code. Even beginners here will be learning just how to do exactly that (that is, don't leak), and why it's important. >Rust Well, if you build a fully-functional robowaifu system using R*st Anon, I'm sure some out there will read your code. However, 'I wouldn't touch it with a ten-foot pole' as the saying goes haha. :^) >=== -prose edit
Edited last time by Chobitsu on 05/04/2023 (Thu) 06:48:45.
Mojo - a programming language for AI, built on Python but also using C where it helps. Up to 35,000 times faster than Python. https://youtu.be/ssZ4wKkiDSY
>Nerves is an open-source platform that combines the rock-solid BEAM virtual machine and Elixir ecosystem to easily build and deploy production embedded systems: https://nerves-project.org/ AtomVM - the Erlang virtual machine for IoT devices: https://www.atomvm.net/
>>20980 Can you elaborate? Why ANSI-C89? Because of the book or you don't like the newer features?
>>22689 >Why ANSI-C89? Because of the book or you don't like the newer features? Because of the book. It and the C89 standard are basically equivalent. Actually I highly-applaud some of the newer features of C17 & C23.
>>20958 >>20959 >>22017 >>22033 Thanks for the links on Nim and Rebol, I only could glance at it but it looks very interesting.
>>22289 >Simply don't leak is the proper answer This. There are several layers to this statement: it seems obvious to a beginner, foolish to someone with some practice, and obvious once again to someone who is experienced. There comes a certain point in skill where the occasional leak still wastes less memory than having a garbage collector, and it's not hard to get there. And there's a step above that, where you are experienced enough in writing and reviewing software, as well as in using tools that help you detect leaks and other such bugs (like valgrind), that the bug count (including leaks) in an established code base will only decrease over time. It's really not a big deal; some people have just created a sort of mysticism around malloc() and free() when in fact there's none in something so simple. Also, even if you have automatic memory management, the principle of "acquire resource" and "release resource" is everywhere. Why is nobody talking about garbage collecting file descriptors, processes, devices (USB connections, graphics cards, etc.), and the like?
>>22722 >Why is nobody talking about garbage collecting file descriptors, processes, devices (usb connections, graphics cards, etc), and the like? Actually, our PPP2 textbook talks about resources of all types (including those you brought up & more) and how not to leak them. Mind you, this is a college freshman textbook, not some profound 'mystical tome of programming' (cf. Knuth, et al). This is all straightforward stuff, and we'll be covering many of these directly during our C++ course; beginning with memory as the most basic and fundamental resource of all. (>>19777) I consider this issue such a well-solved area that I initially regard any hand-waving on the topic to be a) the thing to do (and therefore likely done in ignorance as 'social fashion statement'); or b) done with a specific agenda against the two primary, established systems programming languages (to wit: C & C++). Simple as. >=== -prose edit
Edited last time by Chobitsu on 05/24/2023 (Wed) 02:58:45.
Open file (61.81 KB 499x500 download (3).jpg)
I can program many things, I have 16 GitHub repos, and I haven't read a damn thing other than tutorials on the internet. Just head on over to https://wokwi.com/ and learn how to program for the Arduino, which is basically telling each thing to turn on and off. An Arduino Uno has 14 digital input/output pins, which means there are 14 things you can turn on and off with one.
>>22726 Neat stuff Anon, thanks!
Open file (33.08 KB 474x379 OIP (6).jpg)
>>22733 I just did just now with the help of chatgpt https://wokwi.com/projects/365936533134899201
>>22836 That's quite a neat tool.
>>22839 I knew I had one laying around. It has no lights though so I'm not sure it's turned on when I'm using the female dc plug thing. I still need to get what I think is the mini USB to plug it into the computer though.
I've mentioned Ada a few times during the board's history, but may as well throw another anon's views in the ring ITT as well. https://hackaday.com/2019/09/10/why-ada-is-the-language-you-want-to-be-programming-your-systems-with/
>>22836 You already fucked up, how much time passes between each sleep? That timer is going to drift. You should use `clock_nanosleep()`. Also, the loop could be simpler. The most obvious (i.e. "clean") code only needs 2 digitalWrite() calls in the entirety of loop(). And is that C++ or is it C? Because if it's C, your functions are using old style parameter lists.
Open file (2.01 MB 4000x3000 IMG_20230530_155648.jpg)
>>22860 Tomorrow it begins. I need to go to sleep . And I'll be using sub optimal code generated by chatgpt bahaha
>>22862 Great! Please post your work in our Prototypes thread (rather than off-topic here ITT). TIA Anon, good luck!
>>22863 thank you
>>22863 With code about blinking LEDs? Please don't. Get a Gitlab or -hub account.
Open file (515.54 KB 498x380 bulma-finger.gif)
i still haven't forgotten. I'm making the blinking leds tomorrow or the day after tomorrow. >>22866 ca you post the code on pastebin or github? thank you.
>>22869 >ca you post the code on pastebin or github? thank you. A possible alternative might be to follow the training in out Embedded Programming Class Anon. (>>367) In fact if you do, then I'd say just post your responses there. That thread needs reviving anyway! Cheers. :^)
>>22844 >Ada I suggest languages that I believe would be easier to get started with; that's not totally foolish, as visible progress helps drive the work. Ada, while likely harder to use and requiring way more typing, might end up saving you a vast amount of time, since it's a language structured from the start to not allow a myriad of pesky bugs that can be really hard to find.
Open file (50.00 KB 1590x627 Mojo speed.jpeg)
I ran across this podcast: Chris Lattner: Future of Programming and AI | Lex Fridman Podcast. This guy is a major hard-hitting guy (the Clang compiler): "...Chris Lattner is a legendary software and hardware engineer, leading projects at Apple, Tesla, Google, SiFive, and Modular AI, including the development of Swift, LLVM, Clang, MLIR, CIRCT, TPUs, and Mojo..." on compilers and AI. So he has this new language called Mojo. I'm listening to him talk about this, and it has a huge amount of the stuff that I talked about for Rebol and Nim. Nim I saw as a continuation, sort of, of Rebol, but using Python syntax. He's taken this further: he's set up Mojo as a superset of Python that compiles Python, while also using Mojo to compile for all the individual offshoot processors for AI. I can't see how he can fail in this. I suspect he will rocket ahead with this thing and it will get tons of support, the reason being that Python is almost obligatory in AI, has a shit-ton of libraries, and this guy knows exactly what he is doing and knows all the right people to make this work. He's speeding up Python code by a huge amount. I think Nim and the newer compiled Rebols (Red, Meta) look great, but this will probably rocket ahead in support and usage far, far, far beyond the others. It has metaprogramming and DSL (Domain Specific Language) support. Search for the video. I suspect this will bury C++ (sorry Chobitsu). There's just so much Python used, and given its domination of AI, it's hard to see how this guy can fail, considering who he is and what he has done. Likely a guaranteed home run.
>>23461 >I suspect this will bury C++ (sorry Chobitsu) Lol. No apologies Grommet. I quite suspect it won't, but I certainly applaud any non-pozzed, systems-oriented languages. And I certainly encourage every true Anon here to make time to learn such. Particularly the most important systems languages of all: C and C++.
I'm kind of tired from working on this so I'm going to leave it to the programming geniuses around here and see if they can handle some python. Here is what I have so far: https://github.com/peteblank/waifu-conversation/tree/main I'll be making a video about it soon. I just need to play the mp3 at the very end really. I think archlinux might be giving me a hard time.
>>23473 >I think archlinux might be giving me a hard time. My guess is you may be a bit ahead of yourself Anon. Might I suggest either Mint [1], or Ubuntu [2] ? They both have great support teams to help you out. Good luck with all your programming work Anon! :^) 1. https://linuxmint.com/ 2. https://ubuntu.com/
>>23474 I tried play.sound(output) and it gave me the following error:
>>23475 [ALSOFT] (EE) Failed to connect PipeWire event context (errno: 112) I don't know why it posted by itself. Anyways that's why I added a bash script...
>>23476 >[ALSOFT] (EE) Failed to connect PipeWire event context (errno: 112) I'd suggest you a) switch to Mint or Ubuntu first, and b) search that exact error message (in quotes) if it still happens again. Installing a new machine shouldn't take you more than about 15-20 minutes or so on reasonably newer hardware, so it's a low investment of time to use that route. Good luck!
>>23477 I just used another terminal and it worked B)
>>23461 >Chris Lattner_ Future of Programming and AI _ Lex Fridman Podcast Eyy, that podcast is what I wanted to recommend. He's involved in Mojo (Lang) and some other platform: https://youtu.be/pdJQ8iVTwj8 to make AI development faster: >modular The Modular Engine unifies AI frameworks and hardware and delivers unparalleled performance and cost savings. https://www.modular.com
>>23486 how can a language even have anything related to performance
>>23479 GG >>23486 >The Modular Engine unifies AI frameworks and hardware and delivers unparalleled performance and cost savings. Haha reads like ad-copy, Anon. :^) >>23487 In a lot of different ways. Generally, the closer the language 'sits' near to the actual hardware itself (that is: the CPU, its memory subsystems, etc.), then the higher the performance. C & C++ are the two general-purpose languages that have excelled at this long-term (for about 50y & 40y, respectively).** ASM (machine-specific assembler) is really about the only practical language at a lower level than these two. The tradeoff here (all engineering is a series of tradeoffs) is complexity. As developers, the closer you are to the hardware, the fewer 'nets' you have to protect you (and you have more freedoms, too) -- you must manage many things directly yourself. But systems devs don't generally want many of those nets anyway, because it 'bloats & slows' the system. C++ & C (with a little bit of ASM) are our preferred systems languages for creating IRL robowaifus; in that order. These three choices are most directly related to the language's performance & stability. --- ** Side-note: they are also both ISO international standards (a form of sovereign treaty), so they are extremely stable (a yuge win for /robowaifu/, et al). >=== -prose, fmt edit
Edited last time by Chobitsu on 06/25/2023 (Sun) 21:13:35.
>>23491 >Haha reads like ad-copy, Anon. It does, but... this guy has major skills. He has done some really heavy lifting: the Clang compiler, the Swift language... and with every one of these he has seen what can be done, what can't, and what they didn't do. I personally think Python is icky, but people way smarter than I disagree (not difficult to find them). My understanding is he is also rewriting the Python C compiler, or tuning it up; the speed increase should be high. I linked a guy the other day who has been programming for 40 years, and he was, and is, a huge fan of Rebol. He is now using the Python Anvil framework. He says that while it's huge, it does everything extremely fast. He develops for people online and makes his GUIs in a browser. He says Anvil just breezes through everything. He can change aspects of his clients' apps in real time through a browser, or give them several choices. He says they love it and his work is super fast. One of the keys, he says, is that Python has these massive libraries, so he can stitch together stuff super fast; for anything he can think of, there's likely a library for it. He also likes the browser "metro" interface elements and uses them a lot. http://www.rebolforum.com/index.cgi?f=printtopic&topicnumber=46&archiveflag=new It's like JavaScript: not the best, but it's everywhere. I suspect Python with Mojo will do the same, as you will be able to use all these Python libraries and then tie that into AI code AND AI-specialized hardware with Mojo. @ >>23487 If you watch the video I linked @ >>23461 you will see he talks about exactly that subject.
>>23507 >It does buty...this guyhas major sills. He has doe ARRRGH. My typing so poor. Apologies. I forget that little ass window and sometimes forget to scroll up and check.
>>23507 Still just a language. If it's just for AI, then it's not so much a language problem; it's more that GPU makers don't publish their ISA like CPU makers do. (The use of GPUs itself is purely ad hoc: they are only used because a GPU is hundreds of small CPUs in parallel that execute in waves, ideal for graphics processing, which just so happens to overlap with NNs; it's still not designed for this, no matter the marketing.) So ultimately it doesn't matter what language you use: so long as no one can make compilers for GPUs, every language has to use their shitty driver API, and it doesn't really matter how fast you call the API, because no one wants to use an API to begin with. The closest you can get is SPIR-V, which is still just an API in disguise. Either a hardware manufacturer shows up that doesn't have a stick up their ass and makes SIMD processors similar to a GPU but not designed for graphics, with no shitty API, just general SIMD computation, or they all finally agree on a standardized ISA (not a fucking API ((SPIR-V/OpenCL))) like what happened with CPUs and x86, although I think that only happened because of lawsuits, not the greater good.
>>23507 >He has doe some really heavy lifting. No I totally get that Grommet. And I don't really think I have a case of either arrogance, nor foolishness in this situation either. With humility I suggest its quite the opposite in fact. We are all in the middle of a yuge sociological warfare, that evil forces have arrayed against all virtuous manhood (males specifically), to destroy both us and indeed the civilization we have built, that the entire world rests upon. These may seem like grandiose terms -- until you simply look around you. The Globohomo Big-Tech Government is very clearly engaged in a methodical attempt to eradicate the White race, followed by men in general. Their machinations are the sole reason that /robowaifu/ even exists in the first place; to fight against the degenerate effects of the Globohomo's literal favorite pet: feminism. They have already managed to do great destructive evil with it as every regular here is quite well-aware. Remember how I said "...but I certainly applaud any non-pozzed, systems-oriented languages" (>>23470) before? Well feminism & all the other -isms that immediately follow-on from it are all part of this exact same phenomenon going on within the software & technology fields generally. Why do you think we here on the Internets call G*thub SJWhub? Or its loud, politically-correct proponents CoC-suckers? Tech is clearly a big weapon in the hands of the GH and its all-too-willing golems. They will cancel you out, and do everything they can to hinder or stop you if you refuse to toe their line of affirming all their usual suspect's usual, evil bullsh*te. But thankfully, by the grace of God, we have some software tools that are largely immune to their shenanigans & meddling. The two most important ones being -- you guessed it -- C & C++ .
This is due to the fact that before the GH made its big lockdown moves post-9/11, post-Gamergate, post-Current Year, these two very powerful languages were already cast under the auspices of the ISO, and had already been adopted as international standards. This means that the countries themselves must cooperate together to maintain the status-quo with these language definitions, and that no single GH entity (like, say, A*ple or G*ogle or the Wh*tehouse) alone can pull their cancel-culture plug on any groups """improperly""" creating powerful anti-feminist systems like robowaifus """abusing""" these programming languages & related software technology, etc. This wonderful situation for C & C++ is markedly different than, say, R*st, where -- let's be honest -- 'CoC-sucking' is their middle name haha. :^) >tl;dr As incredibly politically-incorrect as we all are here, we simply cannot put ourselves at the """tender""" mercies of these demonic organizations that are attempting to abscond with the entire technological basis of civilization around us. We can't. >ttl;dr Don't drink their koolaid bro. Remember who is in back of, and driving these agendas. They are no friends of humanity. >=== -prose edit
Edited last time by Chobitsu on 08/05/2023 (Sat) 08:47:20.
>>23513 >I don't really think I have a case of either arrogance, nor foolishness If you read it as my accusing you of this, then I didn't word it right, because I in no way meant that. I'm fully, really fully on board with all you said. The whole entire manufactured culture we have is vile and evil. I'm not always so good at expressing what I mean, though in my defense most of these things are complicated. When I keep pushing other languages, it's mostly not because I don't believe in C and C++'s capabilities. It's because I do not think MY capabilities with such difficult languages are up to the task. I expect I'm not the only one in this class, so I try to find ways around this. I comment on what I find that can possibly make this task easier. I do believe there are easier ways and better languages that can do the same thing C++ does, with small performance hits that with modern hardware amount to not so much. There's always the basic "the enemy of the best is the good enough": I'm looking for good enough in some places and the best in others (my hatred of seams in skin suits). Now, the video I mentioned here >>23461 and linked by NoidoDev here >>23486 : I've been watching this. It's so deep, for me, I can't watch it all at once and have to back up several times. Now maybe he's full of it, but if he can pull this off it will be very beneficial. One thing I really like is he is doing all the low-level work. He is expressly saying things that I worry about. He says that he is looking at the low level, at the physics of how processors work, to get the best speed; then he is building a framework to work with, and he is specific, the thousands of different types of graphics, normal, and specialty processors. Then, by making this framework and laying Python over it, you can stay in the higher-level functions and still get high performance. Yes, there are some additions.
The gist I've gotten of these so far is typing, and setting up the ownership of various variables instead of Python's allowing this to go unspecified (for speed; there is more, but I don't understand it yet). However, his goal is that if you do not wish for max performance, it works like regular Python. He's talking a year before they have a hard-core package, but some of it works now. Now for you this may be no big deal, but for most, who are not C++ programmers, being able to string together libraries to do a task is super great stuff. This has real-world consequences. I had an idea: what does it cost to increase power? I linked a chart of processor power, in general, and cost from Tom's Hardware (well, this failed, so you have to look it up yourself). Let's say you use the lowest-cost $99 processor to get a waifu walking around, but then you want speech and maybe a few other functions. To double the power costs you about $300 more. Now, what is the cost of the time to write all this code in C++, compared to stringing together some Python libraries and using this guy's Mojo to speed it up? When you start comparing the difference in time spent on this in real dollars, higher-level languages start making a good deal of sense. And since processor power keeps going up and cost keeps going down, upgrades, while using higher-level languages to upgrade without all this bit twiddling, become even more attractive. Now, Moore's law may have slowed down in density of transistors, but I don't think it has slowed much in overall increases in power. They are parallelizing things, which is fine for our task. And even the so-called "end" of Moore's law is premature according to Jim Keller, who is not some guy living under a bridge. I've seen talks where he lays out plans for a path to 50x gate density.
Well in this case the fact that it's made in python does matter cause it's pretty slow, although I'm not sure if it's the programming language or the api. I'm going to redo it in bash.
>>23524 >I've been watching this. It's so deep, for me, I can't watch it all at once and have to back up several times. I'm downloading a lot of such talks as audio, then using them like (audio) podcasts. The downside of this is that making notes is hard while walking around, and a bit of the information, in the form of facial expressions and such, is lost. Generally it's a good thing though, if one has to wait somewhere or wants to walk around for some hours. Just make really sure not to walk into a car if you try the same. You don't need to understand everything at once; you can replay it later. I also recommend podcasts for programming beginners first. I can't give you a good hint here, since mine was in German (Chaosradio). I think I listened to English ones as well at some point, but don't remember a specific name. The "Talk Python to Me" podcast might also explain some details, but it's mostly about specific libraries. It can still help to listen to it even if you won't use the library they talk about, since they explain why it's necessary and talk about Python in general.
>>23525 No, yeah, it was the API. AssemblyAI is too slow, and AWS Transcribe requires that the file be uploaded to a bucket, so I guess that leaves Google's transcription.
>>23528 I'm using aws transcribe since google won't take my card and aws demands that you upload the file to an s3 bucket and then outputs as a json. Really this has been exhausting...
>>23528 >Assemblyai Garbage website, which doesn't allow me to register. Just doesn't, without an error message. >>23530 >aws transcribe Did anyone here try to use Whisper? Or are your GPUs so weak, that you can't? Though, I think small models run on a SBC. That said, if that's not about programming languages, then we're OT here. We have a thread on Speech generation which is unofficially also about speech recognition: >>199 or the thread on NLP >>77 which might be the best suited for the topic of speech recognition.
>>23533 fuck off redditor. You're the problem
Open file (168.28 KB 849x458 26251282793756.gif)
>>23534 sirs this is a language thread do 1 thing and move thread do the needful thenks
>>23534 It's not a resource issue so much as a making-it-more-straightforward issue. I think AWS Transcribe and Polly are more straightforward than running it locally, and I think people who want to talk to the waifu bot would be better off using that either way. Keep in mind the waifu will do most of its computing wirelessly on the computer, since I'm not going to try to fit a GPU on it. Never mind the AssemblyAI though; it's in the Python code, but I ended up ditching it for AWS Transcribe in the bash code. Consider the Python one abandoned.
>>23534 Point taken, but if you're posting nothing more substantial than an ad hominem then please at least wait to do so for a post with the 'evidence' directly to hand in it. Less confusing that way. TIA Anon. :^) >>23536 >that ad though Lol. They just don't make the classics like that any more. Please to where I can log into Mr. NagoorBabu's important classes immediately? 1 free Internets to whoever can adapt something close to that into a robowaifu-centric banner gif for us (>>252). :DDD
>>23540 I don't think that's funny. I'm out of here. >see you tomorrow No really i think i had enough of this nonsense.
>>23541 Okay, I don't want to be too rash, but really, if you could tone down the white supremacist thing I'd appreciate it. Cause really this has nothing to do with it.
>>23550 If you're referring to me Anon, in no way am I espousing a 'white supremacist thing', I'm simply stating the actual situation in the real world. Neither will I 'tone it down', if by that you actually mean 'don't ever mention that again'. We're 'in favor' of all men here (even our obvious enemies like those working for the GH). Their gonads are what's important in this matter, not their heritage. :^) Clearly you're angry, and while I get that, you're not really contributing much to the welfare of the little community here by your regular REE'g at others. Maybe you can tone down the complaining Anon? TIA.
>>23551 Really, there is a line between having thin skin and putting up with a constant barrage of polfaggotry. But okay.
>>23552 Lol. I frankly consider your skin to be the thin one. I don't care if you're racist or not -- it's completely irrelevant to me. I only care whether you're acting as a destructive agent against /robowaifu/ and it's regulars. A good way to not be destructive is to, and I quote, "Don't be a dick". You're being a dick Anon. :^) >Conduct on /robowaifu/ >-First, the two basic rules of the board >> 1 : Spoiler NSFW content. >> 2 : Don't promote either feminism or literal faggotry outside of The Basement Lounge. >https://alogs.space/robowaifu/rules.html >-Second, and simply; don't be a dick. >> "And as you wish that others would do to you, do so to them." >> -t. Jesus Christ >https://biblehub.com/luke/6-31.htm (>>3)
>>23552 To me the issue is trolling, and that includes nonsense like attacking people for using Reddit and Discord, or creating OT debates. We had this issue a while ago, until we got a voluntary, who then started to ban everyone he didn't like or thought shouldn't be here, which turned out to be more or less everyone. The poster in >>23550 didn't state what his problem was. If it was "raycist jokes" then that's the weakest argument, and we don't know who he is, so who cares. This board has its roots on 4chan and later 8chan. It's better to have some jokes which some people find "racist" than to have the people offended by that taking over. That said, >>23536 was directed at me, and I'm not Indian nor programming Java, though I may pick up Kotlin or Clojure one day. So aside from the picture being funny, it's pretty stupid. I kinda ignored this, but I'm suspicious that the people complaining about it may be the same person, or of the same group, trying to disrupt our conversations here with shitposting and flame wars (between sock puppets if necessary). This even includes bringing this shitty thread to the top again and again... though others might do it without bad intentions. The latest topic here was Whisper and people using online services for speech, which is OT in this thread. Using online services for speech recognition is also generally discouraged if used beyond prototyping, which at least should be mentioned. Before that it was once again a discussion about C++ vs Python. This thread is not the best one; I'm considering hiding it again so that I don't have to see it. >>23553 >I only care whether you're acting as a destructive agent against /robowaifu/ and it's regulars. This. But you might be giving it too much leeway already.
>>23557 >But you might be giving it too much leeway already. Trust me, I understand your view NoidoDev. You mentioned the debacle before. I waited long on that in hopes of reconciliation as it's the right thing to do(tm). :^) That's just generally my moderation style, and even when I do act, it's usually just to quarantine the posts in question (if they have any redeeming qualities at all, such as humor :^).
Open file (14.29 KB 512x512 dman_ai_1.png)
Open file (57.32 KB 673x496 pyd.png)
I don't see my favorite language has here, Let me introduce a great programming language that most people do not know about: the D programming language. First I'll give the buzzword-filled blurb that most languages do: D is a safe, statically typed, multi-paradigm systems language with a C-like syntax. Why do I think this language is a good fit here? It's a very productive language, especially for a single developer. One undervalued aspect is that it's not a "big agenda" language focused on one design aspect at the price of all others; instead it focuses on being generally productive. Having come from C++, the super fast compile times and saner design are such a breath of fresh air; D is to C++ what Go is to C. But simply describing D as a nicer C++ is not doing it justice, it has aspects that make it stand on its own. D has the best template system, compile-time introspection, function execution and code generation I have experienced in any language. It's trivial to take an input and generate D code at compile time; there is no need for complicated build scripts that generate source files as separate steps, or any nasty text preprocessors. Using a library called pyd you can get nice Python & D interop; this is one of the reasons why I think D is a good fit for robowaifu. Another awesome feature is the GC. I know C & C++ people will roll their eyes, but I'd argue it's actually really good for being productive. Unlike a lot of languages with a GC, it's not forced; using it is optional, and C & C++ style manual memory management is a valid approach. D provides the @nogc attribute to ensure at compile time that a function will never trigger the GC. Anyone writing high-performance code should already be profiling, and if you're inside a hot spot, as long as you don't allocate GC memory you will have the same performance as C or C++. Finally there is the safety aspect. I am not going to argue that D is equal to Rust in this aspect; it's not. 
But D does give good tools to help ensure memory safety. D naturally has fewer footguns than C & C++, and the @safe D subset is nice. I won't go into more detail; here is a link https://dlang.org/blog/2022/06/21/dip1000-memory-safety-in-a-modern-system-programming-language-pt-1 I will not pretend that D is perfect; it has its problems. It's not a popular language, its community is small, Phobos (the std lib) is not perfect for @nogc code, etc. I can elaborate if anyone wants. I try to approach programming language discussions by talking about positives and not attacking other languages, but I will just say that I bring up D as a counter to C, Go, C++ & Rust. I have used C++ for a long time, I have used Java & Node, and I have tried out Go and Rust. Go gets a lot right. I love its fast compile times, and it's not a bad language, but it's made for a corporate/large environment with a large code base and an army of expendable employees. Keeping it simple is very useful; it ensures the code base is maintainable over time even as people come and go. But for a small group of developers or an individual, its restrictiveness is not helpful. One graybeard template wizard can be more productive than 10 Java monkeys. Then there is the Rust meme. The language has good ideas and has had a good impact on language design; it has gotten more programmers thinking about memory safety. But that's not the only important factor. What ultimately killed Rust for me was how slow working in it is, & I am not talking about borrow checker errors. I'm talking about the slow compile times and the rigidness that builds up as your project grows; it's often very hard to refactor a Rust (and C++) project. In my experience D, while not perfect, is actually really good at making projects enjoyable to work on. I don't think anyone here is getting paid, so developer enjoyment becomes important for motivation.
>>24667 >I don't see my favorite language has here Brilliant, I just noticed the typo within the first few words, despite reading the post before posting -_- I hope people still read this and don't write me off, I'm not a complete retard. It should be "I don't see my favorite language here"
Open file (91.82 KB 736x552 chii_ponders_2.jpg)
>>24667 >>24668 Hello EnvelopingTwilight, welcome. Mind if we just call you 'Twilight'? :^) Thanks for your inputs! Alexandrescu is a brilliant man, and has helped the C++ community a lot. D is a great idea of his (and could potentially have supplanted C++ very well), but unfortunately it can't really make much practical headway in the real-world systems programming domain (that is, a world completely dominated by the C & C++ programming languages) to effect real change. This is probably why he himself abandoned his own language a few years back. >G* & R*st Both have some good ideas (stolen from proven systems already written in C & C++), and -- particularly in the case of R*st -- their communities are, roughly speaking, the equivalent of a pride parade down the middle of some conservative Technology Main Street town that doesn't care about or want any faggots around. >tl;dr It's their communities that are the primary problem; but the fact that both organizations behind them are squarely within the Globohomo Big-Technology Government complex is not at all helpful either... for anons trying to create unencumbered, opensauce robowaifus to serve men the world over. >tl;dr Both languages are far too toxic and problematic to be of much use to us here tbh. --- Glad you stopped in Anon. I think your pic is a cute one -- your robowaifu a cute! :^) BTW, we have an Embassy thread (>>2823) if you'd like to introduce yourself/your community to us here.
>>24667 probably the only language that got inline asm right, nice and simple without all the goddamn fluff and insanity like making every line a string or giving variables by name but then only being allowed to use them as a number represented by the order in which you listed them oh and dont forget to specify if its a global/memory/pointer/register/etc because clearly the compiler cant just fucking figure it out from the declaration d seems good if you want to mix high level and low level code, like actually mix not just using external functions
>>24669 >Hello EnvelopingTwilight, welcome. Mind if we just call you 'Twilight'? :^) I don't mind, feel free to do that. >This is probably why he himself abandoned his own language a few years back. Alexandrescu did not abandon D. He stepped back because he prioritized his family over a programming language (very based); here he is saying that in his own words (with more details) (1). He is still with us: he actively participates in the D foundation, attends the meetings, and is absolutely still around. If you listen to the rumor mill and read the doom posting within the D community you can get the impression that the language "is dead" or "dying", or that leadership sucks. But that is not the case; the people at the top are very skilled programmers. That being said, the D heads have the social skills of programmers and are not "people" people. Some don't like it, but I love it: the top is not made up of "Public Relations" do-nothings. The D community is very self-critical, often to a detrimental level. You will see DIPs (proposals for D) "die" & then people will complain and say D is dying or that the language is stagnating, and yet with new releases I find quality progress that no one will celebrate. If you want a change in D, write a quality pull request and you will find that it's actually not hard to get stuff into D; what is very hard is to get others to do that for you. Don't be an "idea guy". If you use the D standard library you will be running code I wrote; that's how I know this is true. >It's their communities that are the primary problem; but the fact that both organizations behind them are squarely within the Globohomo Big-Technology Government complex I did not want to get into bashing languages, but why not. I 100% agree, the Rust community is the worst group of people I have had the displeasure of ever interacting with. Rust has its roots in Mozilla, and that alone is a red flag. 
The culture at Mozilla is some of the most extreme mental illness on the planet. Here is an article I ran into that does a good job of showing how horrid Mozilla is (2) & why you should never give them a cent. If you use Firefox, please use Librewolf (3); that's my daily driver. Do not support Mozilla, they actively fund left-wing extremists that hate you. Another thing I will bring up is the Actix-web drama, when the community was such aids over the unsafe keyword that it pushed the developer to quit. If I were in that position I too would say "I am done with open source". >BTW, we have an Embassy thread (>>2823) if you'd like to introduce yourself/your community to us here. Sure, I'll make a post & say hi. tl;dr having a smaller community not built out of hype and not backed by any large globohomo corporations is actually kinda nice :^) 1: https://www.youtube.com/watch?v=cpTAtiboIDs&t=3049s 2: https://lunduke.locals.com/post/4387539/firefox-money-investigating-the-bizarre-finances-of-mozilla 3: https://librewolf.net/
>>24680 >he actively participates in the D foundation and attends the meetings and he is absolutely still around. Excellent! This is very good news to hear Anon. D is a very nice language and would be a good candidate for a robowaifu systems language as I've pointed out here before. But the fact that it was (by all appearances) abandoned by its #1 developer pretty much disqualified it from current consideration. Also, it's not yet an international standard (and therefore quite-susceptible to Globohomo-incited corruption). I know Andrei works closely with the members of the C++ Standards Committee, do you think it likely that D will be made into an ISO international standard anytime soon? That would be a tremendous boost to legitimately investigating using it as the primary robowaifu systems programming language. >If you use the D standard library you will be running code I wrote, this how I know this is true. Neat! I've often toyed with the idea of joining the C++ Standards Committee myself, and I'm exactly the type that Bjarne Stroustrup is clamoring for to do so. That is, I'm an end user, and not a corporate-interest shill. >Do not support Mozilla, they actively fund left wing extremists that hate you. Indeed. All the Filthy Commie cohorts, and most of the general Leftist ones, hate what /robowaifu/ and its cadres stand for with a passion. This is as they have been programmed by the Globohomo indoctrinations to be of course -- all by design. Marxism is a cult religion for these golems; we here are in flat opposition to all that by empowering individual men against their machine. :^) >tl;dr having a smaller community not built out of hype and not backed by any large globohomo corporations is actually kinda nice :^) Yes, we are all of those things Twilight. Looking forward to your Embassy post. Cheers. :^) >=== -minor edit
Edited last time by Chobitsu on 08/21/2023 (Mon) 00:56:03.
>>24687 >do you think it likely that D will be made into an ISO international standard anytime soon? No, there is no interest in that at the moment; I do not see it happening soon.
>>24762 OK, fair enough. It's a big commitment, and can also be incredibly-convoluted as a formal process. On the plus side however, being an international ISO standard makes any language much more robust in the face of opposition to """controversial""" uses -- such as creating robowaifus! :^) >=== -prose edit
Edited last time by Chobitsu on 08/24/2023 (Thu) 14:02:31.
>>25020 First talk is a skip
>>25020 Thanks Anon, I'll check this year's event out at some point soon.
Dconf has ended, I have commented the time stamps for each talk (except day 1, someone else did that). For people who use the SponsorBlock addon, I have submitted the Intermissions. https://www.youtube.com/watch?v=uzuKqiFVNZM https://www.youtube.com/watch?v=wXTlafzlJVY https://www.youtube.com/watch?v=CMrb6ZWhqXs For more information on the talks the schedule is here https://dconf.org/2023/
>>25102 Excellent. Thanks for all the hard work, Twilight. So, one of the language's important principals is closely associated with Ucora. Can you give me any further insights on them beyond link-related, Anon? https://www.ucora.com/about/
>>25103 Here is the forum post that announced their involvement with the D foundation; that should hopefully give you an idea of what the relationship is and what they are doing for the foundation. https://forum.dlang.org/post/avvmlvjmvdniwwxemcqu@forum.dlang.org Hope this answers your question.
>>25126 Thanks for the link Anon, I'll peruse it.
So I learned something different that could be very significant for robust, error-free waifus. I'm looking at the Ada language for kicks. I start following links and find something I didn't know. AdaCore, which is connected with Adafruit, a company that makes lots of different single-board computers and electronics parts, is really into Ada. Especially the SPARK programming language. Spark is a subset of Ada used to trim it down and make it even more safe. The idea being that you have to declare everything and the compiler catches most every mistake to leverage out bugs. Hackaday has an article on this. Some say it's BS but a lot say that Spark, and Ada, really crush the likelihood of bugs. There's no doubt that lots of things that just have to work -- space shuttles, space stations, F-22, F-15 and lots of medical equipment -- use Ada or Ada Spark and have strong protections against surprise bugs. https://hackaday.com/2019/09/10/why-ada-is-the-language-you-want-to-be-programming-your-systems-with/ I found a video where Nvidia says they are going to move to all-Spark programming because so many of their chips are used in task-critical areas like self-driving. https://www.youtube.com/watch?v=2YoPoNx3L5E Here's a link on learning Ada and Spark: https://learn.adacore.com/ In the link below it gives a huge number of mission-critical items that used Spark. https://en.wikipedia.org/wiki/SPARK_(programming_language) While Spark and Ada may not be glamorous or crafty like LISP, they often work exactly as programmed, without surprises, the first time. Worth thinking about. I wonder, are there tools that will accomplish the same thing with C++ or other languages without using Ada or Spark? https://www.hackster.io/adacore
>>25214 You know I think a lot of you Grommet, so please don't take this personally, but > Adafruit They are a great iconization of Leftists in general, and will be no friends to /robowaifu/ and our ilk once the 'rubber meets the road', so to speak. The name itself should be a big tipoff. Simply research the founding/history of the organization, Anon. Use them; that's fine as long as you take everything related to them with a big grain of salt. I'd personally much prefer Newark or Black Box for supply of electronics, etc. Not only are they much more well-established in the engineering domains, they are much less likely to be completely overrun by Filthy Commies. :^) > Ada I've mentioned the benefits of Ada numerous times here on the board; also why I feel it is primarily-disqualified for us. The hardware onboard aircraft doesn't 'magically' get safety just because Ada is being used there. It still requires strict engineering discipline and rigorous testing (at huge costs: Ada is a very expensive language to use in general). And no system is 100% safe. > I wonder are there tools that will accomplish the same thing with C++ Both D and C++ can be written in highly safe ways. However D has other challenges for us here, so I can't recommend it (at least yet). C++ has a very strong impetus for a pared-down, sensible/sane usage of the language known as the C++ Core Guidelines [1], which I intend to teach some of the most important basics of in our secondary classes here (including automated tools for checking CPPCG compliance). 1. https://isocpp.github.io/CppCoreGuidelines/CppCoreGuidelines >=== -prose edit
Edited last time by Chobitsu on 09/05/2023 (Tue) 13:45:46.
>>25215 >don't take this personally No. I had heard of them(...a great iconization of Leftists...)but actually came at them through Ada. I found that the Adafruit people somehow have some sort of affiliation, however tenuous, with Ada which I didn't know about. They link each other. The people that did Ada, I think, made it too much by committee. The Ada fanatics say the newer versions have fixed a lot of this. They have further fixed this, I believe, by paring it down to a subset called Spark. Probably the most retarded name for software ever. How many things are called Spark? Way too many. When you say Ada is "expensive" I think of it totally differently. It's well thought out, takes a whole-system approach to things, and is not the hacked-up abomination that C is. (I've been reading the "Unix Haters Handbook" again) If these people had real smarts they would make a good higher-level Spark program, like they have, then they would get some sort of miracle C/C++ hacker and make an extensive C/C++ program that would test all the basics of the registers and all other functions of a processor or microprocessor, (in C/C++), and then just read what it compiled to. Then equate the compiled machine code results to Spark commands and back-compile a compiler for every processor they could find, while leaving the high-level Spark tools and programming syntax in place. There aren't enough compilers for Spark for all the processors, and making them is not easy. Why they don't use computers to do the things they are supposed to -- drudgery work -- and automate it, I have no idea. Likely because the Ada people hate C and the C people hate Ada, so nothing good ever happens. Worse is better. I fully, I believe, understand why you like C/C++ so much. It gives you complete control, but unless you have years of experience it can cause all sorts of grief. Stuff like Spark and Ada are essentially boring as they do things the right way. 
Ever heard, I know you have, of https://en.wikipedia.org/wiki/Worse_is_better That's what C/C++ is: worse. Let's note that while C/C++ might make it easy to whip something up, all the time you spend on the gotchas, and memorizing all the gotchas, you could have spent with something like Ada and gotten it right in the first place. C/C++ only seem easier after spending years hacking away at them. Stuff like Ada may take longer to set up, but only because you have to get it right or it just bleeps at you and tells you, you fucked up. C will take whatever you throw at it and promptly throw up bits on your keyboard.
>>25219 > conflating the ultra-pozz den Adafruit, with the 40+yo US DARPA/DoD overarching, sprawling, mandated, programming language (rapidly 'replaced' by the MIC industry using C & C++ instead, b/c of Ada's intractability fail -- especially during the first couple decades of the enforced debacle). < shiggy > I fully, I believe, understand why you like C/C++ so much. I don't 'like' C++ except insofar as it is literally the #1 most-likely means for /robowaifu/ 's successful & complete, systems software solutions; to give all of us entirely-unencumbered, practical, realworld robowaifus. > That's what C/C++ is, worse. With all due respect Anon, this is the literal last time I'm having this debate with you. Kiwi's right, you keep rehashing topics repeatedly (cf., search your own use of the term 'worse': ITT ). By all means Anon, knock yourself out. You've expressed an interest in learning how to program software. Go ahead and tackle Ada as your language of choice. GCC offers a great front-end for the language via their amazing compiler system. [1] After enough experience, you too will understand why so few engineers willingly adopt the language on a personal level in actual production systems, and why it is literally one of the most expensive approaches to software development known today. > tl;dr Let's give any further programming language discussions between us a miss, friend Grommet. If you'd like to participate in our own /robowaifu/ programming classes here, and as long as you stay on-topic (ie, C++ programming, etc.), then of course that's an exception to this mandate regarding yourself. Cheers Anon, and kind regards. :^) 1. https://gcc.gnu.org/wiki/GNAT >=== -prose edit
Edited last time by Chobitsu on 09/07/2023 (Thu) 18:03:07.
>>25243 >With all due respect Anon, this is the literal last time I'm having this debate with you. Kiwi's right, you keep rehashing topics repeatedly Fair enough and a fair criticism. I'll try to take it to heart. I do make mistakes and also try to admit it when I do. My apologies.
I was looking into Elixir (lang) for creating my implementation of the cognitive architecture. That said, I keep an open mind and found this here for C++: https://github.com/endurox-dev/endurox https://en.wikipedia.org/wiki/Enduro/X >Enduro/X is an open-source middleware platform for distributed transaction processing. It is built on proven APIs such as X/Open group's XATMI and XA. The platform is designed for building real-time microservices based applications with a clusterization option. Enduro/X functions as an extended drop-in replacement for Oracle Tuxedo. The platform uses in-memory POSIX Kernel queues which insures high interprocess communication throughput. > ... AGPL It also has Python bindings. Hmm. Anyone worked with this?
>>25219 >>25214 I'm a little late (don't have a lot of time to be checking robowaifu) but I think I can say something of value for Grommet. While good languages can have fewer footguns and be designed to make mistakes harder, a language should not be chosen "for being safer" or for any other memed purpose. Are you picking Ada because it solves problems for you, or are you picking it because it's "safe and runs on my heckin airplanes TM"? >but unless you have years of experience it can cause all sorts of grief. This is true; that's why I advise starting now, clocking in hours, and gaining experience and confidence. Even if the language stops memory bugs, there are so many other mistakes you can and will make. You can write good code in almost any language. Just pick a language and start writing, try to learn the language as well as you can, and don't get stuck in analysis paralysis. I guess there is some irony in me saying this in a thread called "Selecting a Programming Language" but I think it's the most helpful thing to say. If you want safety I recommend you take a proven approach: test everything!!! You need 100% coverage, and the tests need to be good and not overfitted. SQLite is a perfect example of this. It's very reliable, runs on le hecking airplanes & is bug-free while being written in pure C. It's not a stagnant codebase of untouchable C; because the tests are good, they can afford to do large rewrites between even minor releases.
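To make the "test everything" point concrete, here's a toy-scale sketch in plain Python asserts (the function and cases are invented for illustration, not from any real project): the idea is that every branch and every boundary gets its own assertion, not just the happy path.

```python
# A hypothetical clamp() tested against every branch and boundary.
def clamp(x, lo, hi):
    """Clamp x into the inclusive range [lo, hi]."""
    if x < lo:
        return lo
    if x > hi:
        return hi
    return x

# One assertion per branch, plus the edges where off-by-one bugs live.
assert clamp(5, 0, 10) == 5      # in-range: passes through unchanged
assert clamp(-1, 0, 10) == 0     # below the range: pinned to lo
assert clamp(11, 0, 10) == 10    # above the range: pinned to hi
assert clamp(0, 0, 10) == 0      # exactly at the lower bound
assert clamp(10, 0, 10) == 10    # exactly at the upper bound
print("all cases pass")
```

Scaled up, this is what lets a codebase be rewritten aggressively without fear: if a refactor breaks a branch, some assertion fails immediately.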
I played around with a rather niche language today: Lobster. It has a pythonic syntax and some ideas similar to Rust in regards to borrowing. So it might be useful for doing something that needs to be secure but easy to write. It's also quite fast, and can compile to C++ as well; that code can then be used for WASM (binary in the browser). The language was mainly meant for creating games, so it has a strong focus on graphics (OpenGL). I'm experimenting in a direction towards using the frames of a video made with "SadTalker" like I posted here >>26029, to have a fast way to make an avatar talk without rendering it with an AI model every time. Imagine a small program that takes in a stream of text and creates speech with the right movements of the lips, not looking human-level "realistic" but reasonably good and without any relevant lag. So far I managed to make it load images and call the text-to-speech service on Linux. Both work very fast. So fast that I can also overlay more than one of those, so this might help with making it smooth. To make this work at some point, I will need to make a picture sequence (frames) for each syllable or something similar, and probably also for some combinations of how the head is positioned. Then the program would load those based on the text input, while also creating text-to-speech output. This can help to investigate animated virtual girlfriends as a predecessor of robowaifus. I also imagine this to be useful for AI creating simple simulations. I don't know how to use that yet, but I have some ideas, and here's a website that inspired me: https://concepts.jtoy.net - The way I think about it: train a model to recognize a pattern in a video input (her own vision, but maybe also watching TV or a game) and match what is going on with one of those concepts. https://github.com/aardappel/lobster https://aardappel.github.io/lobster/builtin_functions_reference.html https://github.com/OpenTalker/SadTalker/tree/main
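The "frame sequence per syllable" lookup could be sketched like this in Python (the mouth-shape table and frame filenames are invented placeholders -- a real version would use phoneme timings from the TTS engine, not single characters):

```python
# Hypothetical table: each sound maps to a pre-rendered frame sequence.
FRAMES = {
    "a": ["mouth_open_1.png", "mouth_open_2.png"],
    "m": ["mouth_closed_1.png"],
    "o": ["mouth_round_1.png", "mouth_round_2.png"],
    "_": ["mouth_rest.png"],  # fallback / silence pose
}

def frames_for_text(text):
    """Turn incoming text into a playlist of frame filenames.
    Walking characters here is just a stand-in for real syllable
    or phoneme segmentation."""
    playlist = []
    for ch in text.lower():
        playlist.extend(FRAMES.get(ch, FRAMES["_"]))
    return playlist

print(frames_for_text("mao"))
# → ['mouth_closed_1.png', 'mouth_open_1.png', 'mouth_open_2.png',
#    'mouth_round_1.png', 'mouth_round_2.png']
```

The playlist would then be displayed in sync with the audio; since it's just dictionary lookups, the lag should stay negligible compared to rendering each frame with a model.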
>>26092 Neat! This seems like a really promising investigation Anon. I certainly agree that Visual Waifu development, if done well, will probably clean about 50% of the tasks across the board from our table toward making great robowaifus. Please keep us up to date, NoidoDev. Cheers. :^)
>>26093 I'm struggling a bit with the Lobster language, since I'm not used to compiled languages and it is not very well documented. I tried the same thing in Python with help from ChatGPT. Dev time was way shorter, only like half an hour; it takes much longer to start the program though. Now I'm looking into compressing the frames somehow by removing common areas between two images. The frames are circa 80x the size of a short video (450MB, 256 colors).
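One simple form of that "remove common areas" idea is a per-frame delta: store only the pixels that changed since the previous frame. A minimal Python sketch, with frames modeled as flat lists of palette indices (a real version would operate on decoded image buffers):

```python
def frame_delta(prev, curr):
    """Return only the (index, value) pairs where curr differs from prev."""
    return [(i, v) for i, (p, v) in enumerate(zip(prev, curr)) if p != v]

def apply_delta(prev, delta):
    """Rebuild the current frame from the previous one plus the delta."""
    frame = list(prev)
    for i, v in delta:
        frame[i] = v
    return frame

prev = [0, 0, 0, 7, 7, 7]
curr = [0, 0, 9, 7, 7, 5]
d = frame_delta(prev, curr)
print(d)                             # [(2, 9), (5, 5)]
assert apply_delta(prev, d) == curr  # lossless round trip
```

For talking-head frames where only the mouth region moves, the delta should be a small fraction of the full frame, which is essentially what video codecs do with inter-frame compression anyway.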
This is VERY COOL. A "Scratch"-like visual programming environment for micro-controllers. ESP32 included. Has a multitasking OS using byte code. The OS is 16k. Can run in a browser or as a downloaded program. Can be used to test programs in a virtual micro-controller on the web or, I think, built into the program. http://microblocks.fun/ I've been reading a lot of Alan Kay's stuff so this makes sense.
>>26245 Okay. Not sure how much one could do with that and if it was suitable for doing something related to robowaifus, but it might come in handy.
>>26245 Neat! Thanks Grommet (good to see you BTW). I love Scratch, and think its kind of visual interface will be highly valuable as a scripting interface for us, once robowaifus have become commonplace enough for Joe Sixpacks to be clamoring for their own.
Python Sucks And I LOVE It | Prime Reacts https://www.youtu.be/8D7FZoQ-z20 tl;dw: Execution speed is only one thing; development speed often matters more, and getting things done even more. Builtin types are in C anyways. Parts of the code can be transferred to Cython. Not mentioned: Mojo is around the corner, and based on Python.
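That "builtin types are in C" point is easy to measure yourself with the standard library's timeit; a quick sketch (timings vary per machine, so none are hard-coded here):

```python
import timeit

data = list(range(100_000))

def py_sum(xs):
    # Same reduction as sum(), but every add goes through
    # the bytecode interpreter instead of the C fast path.
    total = 0
    for x in xs:
        total += x
    return total

assert sum(data) == py_sum(data)  # both compute the same value

t_builtin = timeit.timeit(lambda: sum(data), number=50)
t_loop = timeit.timeit(lambda: py_sum(data), number=50)
print(f"builtin sum: {t_builtin:.3f}s  pure-Python loop: {t_loop:.3f}s")
```

On CPython the builtin typically wins by a wide margin, which is the video's point: the hot primitives are already native code, and the parts that aren't can be pushed into Cython.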
Edited last time by Chobitsu on 11/08/2023 (Wed) 03:42:12.
>>26260 >Mojo is around the corner, and based on Python. Yeah, I'm curious if Modular will ever decide to end their commercial goals surrounding Mojo, and release it free as in speech to the world with no strings attached. It would be a shame for anons to get mired into some kind of Globohomo-esque tarbaby trap with something as vital to our robowaifus as her operational software. >=== -prose edit
Edited last time by Chobitsu on 11/08/2023 (Wed) 03:55:26.
Mojo was mentioned, but no mention of Julia? I haven't had experience with either, but Julia seems like it might be better for the people here that already know Python. https://www.datacamp.com/blog/julia-vs-python-which-to-learn https://exploreaiworld.com/mojo-vs-julia However, when I check on some people doing benchmarks, it's not uncommon for Mojo to be faster -- perhaps because it takes more effort to optimize Julia code, or they just had a flawed method of comparison, though it is possible Mojo is actually faster. However, Mojo isn't even open source currently as far as I'm aware, and may never actually be, while Julia was always open source and has been around a while now. In either case both are faster than Python when dealing with large datasets, as far as I can tell. Julia can call C, C++, Python and Fortran libraries. Am I mistaken, or isn't this likely better than Mojo and Python? Python being popular doesn't mean it's the best; it's just a general-purpose language with a simple syntax, so a lot of programmers know it. But the people in ML research are adopting Julia at an increasing rate recently, although it's not anywhere near as popular even there just yet.
Great, newbie-friendly introduction to ASM programming [1] by one of the self-taught masters, Matt Godbolt; creator (from scratch!) of the world-class programming analysis tool, Compiler Explorer [2]. Highly recommended video for anyone trying to understand what assembler/machine-code programming is. Good investment of 20-minutes for robowaifuists, Anon! :^) >note: Also the short, old book he mentions at the beginning of the video is available on Wayback. [3] --- 1. https://www.youtube.com/watch?v=8VsiYWW9r48 2. https://godbolt.org/ 3. https://archive.org/details/machine-code-for-beginners >=== -minor edit -add 'Wayback' hotlink
Edited last time by Chobitsu on 04/09/2024 (Tue) 03:00:28.
>>30828 Thanks, I'll watch this when I also have time to play the "Human Resource Machine" which is a learning game for that.
