/robowaifu/ - DIY Robot Wives

Advancing robotics to a point where anime catgrill meidos in tiny miniskirts are a reality.





Selecting a Programming Language Robowaifu Technician 09/11/2019 (Wed) 13:07:45 No.128
What programming language would suit us and our waifus best? For those of us with limited experience programming, it's a daunting question.
Would a language with a rigid structure be best?
Do we want an object-oriented language?
How much do you care about whether or not a given language is commonly used and widespread?
What the fuck does all that terminology mean?
Is LISP just a meme, or will it save us all?

In this thread, we will discuss these questions and more so those of us who aren't already settled into a language can find our way.
>>23513
>I don't really think I have a case of either arrogance, nor foolishness
If you read it that I was accusing you of this, then I didn't word it right, because I in no way meant that. I'm fully, really fully on board with all you said. The whole entire, manufactured culture we have is vile and evil. I'm not always so good at expressing what I mean, though in my defense most of these things are complicated.

When I keep pushing other languages, it's mostly not because I don't believe in the capabilities of C and C++. It's because I do not think MY capabilities in such difficult languages are up to the task. I expect that I'm not the only one in this class, so I try to find ways around this and comment on whatever I find that could make the task easier. I do believe there are easier ways and better languages that can do the same things C++ does, with performance hits that on modern hardware don't amount to much. There's always the basic "the enemy of the best is the good enough". I'm looking for good enough in some places and the best in others (my hatred of seams in skin suits).

Now the video I mentioned here >>23461 and linked by NoidoDev here >>23486, I've been watching it. It's so deep, for me, that I can't watch it all at once and have to back up several times. Maybe he's full of it, but if he can pull this off it will be very beneficial. One thing I really like is that he is doing all the low-level work. He is expressly addressing the things that I worry about: he says he is looking at the low level, the physics of how processors work, to get the best speed, and then building a framework on top of that, specifically covering the thousands of different types of processors, normal graphics and specialty ones. Then, by laying Python over that framework, you can stay in the higher-level functions and still get high performance. Yes, there are some additions to the language. The gist I got so far is typing, and declaring the ownership of variables instead of Python's habit of leaving that unspecified, all for speed (there is more, but I don't understand it yet; a rough sketch of the typing idea is below). However, his goal is that if you don't need maximum performance it works like regular Python. He's talking about a year before they have a hardcore package, but some of it works now. For you this may be no big deal, but for most people, who are not C++ programmers, being able to string together libraries to do a task is super great stuff.

This has real-world consequences. I had an idea: what does it cost to increase compute power? I linked a chart of processor power and cost, in general terms, from Tom's Hardware (well, this failed, so you have to look it up yourself). Let's say you use the lowest-cost option, $99, to get a waifu walking around. But then you want speech and maybe a few other functions. Doubling the power costs you about $300 more. Now, what is the cost of the time to write all this code in C++, compared to stringing together some Python libraries and using this guy's Mojo to speed it up? When you start converting the difference in time spent into real dollars, higher-level languages start making a good deal of sense. And since processor power keeps going up while cost keeps coming down, upgrading the hardware, while using a higher-level language to avoid all the bit twiddling, becomes even more attractive. Moore's law may have slowed down in transistor density, but I don't think it has slowed much in overall increases in power. They are parallelizing things, which is fine for our task.

And even the so-called "end" of Moore's law is premature according to Jim Keller, who is not some guy living under a bridge. I've seen talks where he lays out plans for a path to 50x gate density.
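For anyone wondering what "typing and ownership" additions look like from the Python side, here's a minimal sketch (Python 3.9+). The annotations are standard Python and are ignored by CPython at runtime; the point is that ahead-of-time compilers (Cython, mypyc, and by all accounts Mojo) can use declarations like these to specialize code for speed. The function names and numbers are made-up examples, not from anyone's project.

```python
# Plain CPython runs both versions identically, but a compiler that takes
# type declarations seriously can turn the annotated one into tight native
# code. dot_untyped/dot_typed are hypothetical illustration-only functions.

def dot_untyped(a, b):
    # Fully dynamic: every multiply and add goes through generic object
    # dispatch, which is where interpreted Python loses its speed.
    total = 0.0
    for x, y in zip(a, b):
        total += x * y
    return total

def dot_typed(a: list[float], b: list[float]) -> float:
    # Same logic, but the declared types give an ahead-of-time compiler
    # enough information to specialize the loop to raw machine floats.
    total: float = 0.0
    for x, y in zip(a, b):
        total += x * y
    return total

if __name__ == "__main__":
    v = [0.5] * 1000
    print(dot_untyped(v, v), dot_typed(v, v))
```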
Well in this case the fact that it's made in Python does matter, because it's pretty slow, although I'm not sure if it's the programming language or the API. I'm going to redo it in bash.
>>23524
>I've been watching this. It's so deep, for me, I can't watch it all at once and have to back up several times.
I'm downloading a lot of such talks as audio, then using them like (audio) podcasts. The downside of this is that making notes is hard while walking around, and a bit of information in the form of facial expressions and such is lost. Generally it's a good thing though, if one has to wait somewhere or wants to walk around for some hours. Just make really sure not to walk into a car if you try the same. You don't need to understand everything at once; you can replay it later. I also recommend podcasts for programming beginners. I can't give you a good hint here, since mine was in German (Chaosradio). I think I listened to English ones as well at some point, but don't remember a specific name. The "Talk Python to Me" podcast might also explain some details, but it's mostly about specific libraries. It can still help to listen to it even if you won't use the library they talk about, since they explain why it's necessary and talk about Python in general.
>>23525 No, yeah, it was the API. AssemblyAI is too slow, and AWS Transcribe requires that the file be uploaded to a bucket, so I guess that leaves Google transcribe.
>>23528 I'm using aws transcribe since google won't take my card and aws demands that you upload the file to an s3 bucket and then outputs as a json. Really this has been exhausting...
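For anyone curious what that S3-plus-Transcribe dance looks like in code, here's a minimal boto3 sketch. The bucket name, file names, and job name are placeholders, and it assumes your AWS credentials and region are already configured; it illustrates the workflow described above, not the anon's actual script.

```python
# Upload audio to S3, start an AWS Transcribe job, wait for it, then read the
# JSON transcript it produces. All names here are placeholders.
import json
import time
import urllib.request

import boto3

s3 = boto3.client("s3")
transcribe = boto3.client("transcribe")

bucket = "my-waifu-audio-bucket"          # hypothetical bucket
s3.upload_file("speech.wav", bucket, "speech.wav")

job_name = "waifu-transcript-demo"
transcribe.start_transcription_job(
    TranscriptionJobName=job_name,
    Media={"MediaFileUri": f"s3://{bucket}/speech.wav"},
    MediaFormat="wav",
    LanguageCode="en-US",
)

# Poll until the job finishes, then fetch the JSON result it wrote.
while True:
    status = transcribe.get_transcription_job(TranscriptionJobName=job_name)
    state = status["TranscriptionJob"]["TranscriptionJobStatus"]
    if state in ("COMPLETED", "FAILED"):
        break
    time.sleep(5)

if state == "COMPLETED":
    uri = status["TranscriptionJob"]["Transcript"]["TranscriptFileUri"]
    result = json.load(urllib.request.urlopen(uri))
    print(result["results"]["transcripts"][0]["transcript"])
```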
>>23528
>Assemblyai
Garbage website, which doesn't allow me to register. It just doesn't, without any error message.
>>23530
>aws transcribe
Did anyone here try to use Whisper? Or are your GPUs so weak that you can't? Though I think the small models run on an SBC. That said, if this isn't about programming languages, then we're OT here. We have a thread on speech generation which is unofficially also about speech recognition: >>199, or the thread on NLP >>77, which might be the best suited for the topic of speech recognition.
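For reference, running Whisper locally really is only a few lines with the openai-whisper package (pip install openai-whisper, plus ffmpeg on the system). The model size and file name below are placeholders; "tiny" or "base" are the lighter options if your GPU is weak or you're on CPU-only hardware.

```python
# Minimal local speech-to-text with OpenAI's open-source Whisper models.
# "small" fits on modest hardware; swap in "tiny"/"base" for an SBC-class box.
import whisper

model = whisper.load_model("small")
result = model.transcribe("speech.wav")   # placeholder file name
print(result["text"])
```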
>>23533 fuck off redditor. You're the problem
Open file (168.28 KB 849x458 26251282793756.gif)
>>23534 sirs this is a language thread do 1 thing and move thread do the needful thenks
>>23533 It's not a resource issue so much as a making-it-more-straightforward issue. I think AWS Transcribe and Polly are more straightforward than running it locally, and I think people that will want to talk to the waifu bot would be better off using that either way. Keep in mind the waifu will do most of its computing wirelessly on the computer, since I'm not going to try to fit a GPU on it. Never mind the AssemblyAI though; it's in the Python code, but I ended up ditching it for AWS Transcribe in the bash code. Consider the Python one abandoned.
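To complete the picture on the AWS route, speech output through Polly is about as short as the Transcribe sketch above. Again a hedged sketch only: the voice, text, and output file name are placeholders, and credentials/region are assumed to be configured already.

```python
# Minimal text-to-speech with Amazon Polly via boto3.
import boto3

polly = boto3.client("polly")
response = polly.synthesize_speech(
    Text="Hello Anon, the tea is ready.",   # placeholder line for her to say
    OutputFormat="mp3",
    VoiceId="Joanna",                       # pick whichever voice you prefer
)

with open("reply.mp3", "wb") as f:
    f.write(response["AudioStream"].read())
```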
>>23534 Point taken, but if you're posting nothing more substantial than an ad hominem then please at least wait to do so for a post with the 'evidence' directly to hand in it. Less confusing that way. TIA Anon. :^) >>23536 >that ad though Lol. They just don't make the classics like that any more. Please to where I can log into Mr. NagoorBabu's important classes immediately? 1 free Internets to whoever can adapt something close to that into a robowaifu-centric banner gif for us (>>252). :DDD
>>23540 I don't think that's funny. I'm out of here.
>see you tomorrow
No really, I think I've had enough of this nonsense.
>>23541 Okay, I don't want to be too rash, but really, if you could tone down the white supremacist thing I'd appreciate it. Because really, this has nothing to do with it.
>>23550 If you're referring to me Anon, in no way am I espousing a 'white supremacist thing', I'm simply stating the actual situation in the real world. Neither will I 'tone it down', if by that you actually mean 'don't ever mention that again'. We're 'in favor' of all men here (even our obvious enemies like those working for the GH). Their gonads are what's important in this matter, not their heritage. :^) Clearly you're angry, and while I get that, you're not really contributing much to the welfare of the little community here by your regular REE'g at others. Maybe you can tone down the complaining Anon? TIA.
>>23551 Really, there is a line between having thin skin and putting up with a constant barrage of polfaggotry. But okay.
>>23552 Lol. I frankly consider your skin to be the thin one. I don't care if you're racist or not -- it's completely irrelevant to me. I only care whether you're acting as a destructive agent against /robowaifu/ and its regulars. A good way to not be destructive is to, and I quote, "Don't be a dick". You're being a dick Anon. :^)
>Conduct on /robowaifu/
>-First, the two basic rules of the board
>> 1 : Spoiler NSFW content.
>> 2 : Don't promote either feminism or literal faggotry outside of The Basement Lounge.
>https://alogs.space/robowaifu/rules.html
>-Second, and simply; don't be a dick.
>> "And as you wish that others would do to you, do so to them."
>> -t. Jesus Christ
>https://biblehub.com/luke/6-31.htm
(>>3)
>>23552 To me the issue is trolling, and that includes nonsense like attacking people for using Reddit and Discord, or creating OT debates. We had this issue a while ago, until we got a volunteer who then started to ban everyone he didn't like or who in his mind shouldn't be here, which turned out to be more or less everyone.

The poster in >>23550 didn't state what his problem was. If it was "raycist jokes", then that's the weakest argument, and we don't know who he is, so who cares. This board has its roots on 4chan and later 8chan. It's better to have some jokes which some people find "racist" than to have the people offended by that taking over. That said, >>23536 was directed at me, and I'm not Indian nor programming Java, though I may pick up Kotlin or Clojure one day. So aside from the picture being funny, it's pretty stupid. I kinda ignored this, but I'm suspicious that the people complaining about it may be the same person, or of the same group, trying to disrupt our conversations here with shitposting and flame wars (between sock puppets if necessary). This even includes bringing this shitty thread to the top again and again... though others might do it without bad intentions.

The latest topic here was Whisper and people using online services for speech, which is OT in this thread. Using online services for speech recognition is also generally discouraged beyond prototyping, which at least should be mentioned. Before that it was once again a discussion about C++ vs Python. This thread is not the best one; I'm considering hiding it again, so that I don't have to see it.
>>23553
>I only care whether you're acting as a destructive agent against /robowaifu/ and its regulars.
This. But you might be giving it too much leeway already.
>>23557
>But you might be giving it too much leeway already.
Trust me, I understand your view NoidoDev. You mentioned the debacle before. I waited long on that in hopes of reconciliation, as it's the right thing to do(tm). :^) That's just generally my moderation style, and even when I do act, it's usually just to quarantine the posts in question (if they have any redeeming qualities at all, such as humor :^).
Open file (14.29 KB 512x512 dman_ai_1.png)
Open file (57.32 KB 673x496 pyd.png)
I don't see my favorite language has here, so let me introduce a great programming language that most people do not know about: the D programming language. First I'll give the buzzword-filled blurb that most languages do: D is a safe, statically typed, multi-paradigm systems language with a C-like syntax.

Why do I think this language is a good fit here? It's a very productive language, especially for a single developer. One undervalued aspect is that it's not a "big agenda" language focused on one design aspect at the price of all others; instead it focuses on being generally productive. Having come from C++, the super fast compile times and saner design are such a breath of fresh air. D is to C++ what Go is to C. But simply describing D as a nicer C++ is not doing it justice; it has aspects that make it stand on its own. D has the best template system, compile-time introspection, compile-time function execution, and code generation I have experienced in any language. It's trivial to take an input and generate D code at compile time; there is no need for complicated build scripts that generate source files as separate steps, or any nasty text preprocessors. Using a library called pyd you can get nice Python & D interop, which is one of the reasons why I think D is a good fit for robowaifu.

Another awesome feature is the GC. I know C & C++ people will roll their eyes, but I'd argue it's actually really good for being productive. Unlike a lot of languages with a GC, it's not forced; using it is optional, and C & C++ style manual memory management is a valid approach. D provides the @nogc attribute to ensure at compile time that a function will never trigger the GC. Anyone writing high-performance code should already be profiling, and if you're inside a hot spot, as long as you don't allocate GC memory you will have the same performance as C or C++.

Finally there is the safety aspect. I am not going to argue that D is equal to Rust in this aspect; it's not. But D does give you good tools to help ensure memory safety. D naturally has fewer foot guns than C & C++, and the @safe D subset is nice. I won't go into more detail; here is a link: https://dlang.org/blog/2022/06/21/dip1000-memory-safety-in-a-modern-system-programming-language-pt-1

I will not pretend that D is perfect; it has its problems. It's not a popular language, its community is small, Phobos (the std lib) is not perfect for @nogc code, etc. I can elaborate if anyone wants.

I try to approach programming language discussions by talking about positives and not attacking other languages, but I will just say that I bring up D as a counter to C, Go, C++ & Rust. I have used C++ for a long time, I have used Java & Node, and I have tried out Go and Rust. Go gets a lot right. I love its fast compile times, and it's not a bad language, but it's made for a corporate/large environment with a large code base and an army of expendable employees. Keeping it simple is very useful there; it ensures the code base stays maintainable over time even as people come and go. But for a small group of developers or an individual, its restrictiveness is not helpful. One graybeard template wizard can be more productive than 10 Java monkeys. Then there is the Rust meme. The language has good ideas and has had a good impact on language design; it has gotten more programmers thinking about memory safety. But that's not the only important factor. What ultimately killed Rust for me was how slow working in it is, and I am not talking about borrow checker errors.
I'm talking about the slow compile times and the rigidness that builds up as your project grows; it's often very hard to refactor a Rust (or C++) project. In my experience D, while not perfect, is actually really good at making projects enjoyable to work on. I don't think anyone here is getting paid, so developer enjoyment becomes important for motivation.
>>24667
>I don't see my favorite language has here
Brilliant, I just noticed the typo within the first few words, despite reading the post before posting -_- I hope people still read this and don't write me off; I'm not a complete retard. It should be "I don't see my favorite language here".
Open file (91.82 KB 736x552 chii_ponders_2.jpg)
>>24667 >>24668 Hello EnvelopingTwilight, welcome. Mind if we just call you 'Twilight'? :^) Thanks for your inputs! Alexandrescu is a brilliant man, and has helped the C++ community a lot. D is a great idea of his (and could potentially have supplanted C++ very well), but unfortunately it can't really make much practical headway in the real-world systems programming domain (that is, a world completely dominated by the C & C++ programming languages) to effect real change. This is probably why he himself abandoned his own language a few years back. >G* & R*st Both have some good ideas, stole their concepts from proven systems already written in C & C++, and -- particularly in the case of R*st -- their communities are, roughly speaking, the equivalent of a pride parade down the middle of some conservative Technology Main Street town that doesn't care about or want any faggots around. >tl;dr It's their communities that are the primary problem; but the fact that both organizations behind them are squarely within the Globohomo Big-Technology Government complex is not at all helpful either... for anons trying to create unencumbered, opensauce robowaifus to serve men the world over. >tl;dr Both languages are far too toxic and problematic to be of much use to us here tbh. --- Glad you stopped in Anon. I think your pic is a cute one -- your robowaifu a cute! :^) BTW, we have an Embassy thread (>>2823) if you'd like to introduce yourself/your community to us here.
>>24667 probably the only language that got inline asm right: nice and simple, without all the goddamn fluff and insanity like making every line a string, or giving variables by name but then only being allowed to use them as a number represented by the order in which you listed them, oh and don't forget to specify if it's a global/memory/pointer/register/etc because clearly the compiler can't just fucking figure it out from the declaration. D seems good if you want to mix high-level and low-level code, like actually mix, not just using external functions.
>>24669
>Hello EnvelopingTwilight, welcome. Mind if we just call you 'Twilight'? :^)
I don't mind, feel free to do that.
>This is probably why he himself abandoned his own language a few years back.
Alexandrescu did not abandon D. He stepped back because he prioritized his family over a programming language (very based); here is him saying that in his own words, with more details (1). He is still with us: he actively participates in the D Foundation, attends the meetings, and is absolutely still around.

If you listen to the rumor mill and read the doomposting within the D community, you can get the impression that the language "is dead" or "dying", or that the leadership sucks. But that is not the case; the people at the top are very skilled programmers. That being said, the D heads have the social skills of programmers, not "people" people. Some don't like that, but I love it: the top is not made up of "Public Relations" do-nothings. The D community is very self-critical, often to a detrimental level. You will see DIPs (proposals for D) "die", and then people will complain and say D is dying or that the language is stagnating, and yet with new releases I find quality progress that no one will celebrate. If you want a change in D, write a quality pull request and you will find that it's actually not hard to get stuff into D; what is very hard is to get others to do that for you. Don't be an "idea guy". If you use the D standard library you will be running code I wrote; that is how I know this is true.
>It's their communities that are the primary problem; but the fact that both organizations behind them are squarely within the Globohomo Big-Technology Government complex
I did not want to get into bashing languages, but why not. I 100% agree: the Rust community is the worst group of people I have ever had the displeasure of interacting with. Rust has its roots in Mozilla, and that alone is a red flag. The culture at Mozilla is some of the most extreme mental illness on the planet. Here is an article I ran into that does a good job showing how horrid Mozilla is (2), and why you should never give them a cent. If you use Firefox, please use Librewolf (3); that's my daily driver. Do not support Mozilla, they actively fund left-wing extremists that hate you. Another thing I will bring up is the Actix Web drama, when the community was such aids over the unsafe keyword that it pushed the developer to quit. If I were in that position, I too would say that "I am done with open source".
>BTW, we have an Embassy thread (>>2823) if you'd like to introduce yourself/your community to us here.
Sure, I'll make a post & say hi.

tl;dr having a smaller community not built out of hype and not backed by any large globohomo corporations is actually kinda nice :^)
1: https://www.youtube.com/watch?v=cpTAtiboIDs&t=3049s
2: https://lunduke.locals.com/post/4387539/firefox-money-investigating-the-bizarre-finances-of-mozilla
3: https://librewolf.net/
>>24680
>he actively participates in the D foundation and attends the meetings and he is absolutely still around.
Excellent! This is very good news to hear Anon. D is a very nice language and would be a good candidate for a robowaifu systems language, as I've pointed out here before. But the fact that it was (by all appearances) abandoned by its #1 developer pretty much disqualified it from current consideration. Also, it's not yet an international standard (and therefore quite-susceptible to Globohomo-incited corruption). I know Andrei works closely with the members of the C++ Standards Committee; do you think it likely that D will be made into an ISO international standard anytime soon? That would be a tremendous boost to legitimately investigating using it as the primary robowaifu systems programming language.
>If you use the D standard library you will be running code I wrote, that is how I know this is true.
Neat! I've often toyed with the idea of joining the C++ Standards Committee myself, and I'm exactly the type that Bjarne Stroustrup is clamoring for to do so. That is, I'm an end user, and not a corporate-interest shill.
>Do not support Mozilla, they actively fund left wing extremists that hate you.
Indeed. All the Filthy Commie cohorts, and most of the general Leftist ones, hate what /robowaifu/ and its cadres stand for with a passion. This is as they have been programmed by the Globohomo indoctrinations to be, of course -- all by design. Marxism is a cult religion for these golems; we here are in flat opposition to all that by empowering individual men against their machine. :^)
>tl;dr having a smaller community not built out of hype and not backed by any large globohomo corporations is actually kinda nice :^)
Yes, we are all of those things Twilight. Looking forward to your Embassy post. Cheers. :^)
>=== -minor edit
Edited last time by Chobitsu on 08/21/2023 (Mon) 00:56:03.
>>24687 >do you think it likely that D will be made into an ISO international standard anytime soon? No there is no interest in that at the moment, I do not see that happening soon.
>>24762 OK, fair enough. It's a big commitment, and can also be incredibly-convoluted as a formal process. On the plus side however, being an international ISO standard makes any language much more robust in the face of opposition to """controversial""" uses -- such as creating robowaifus! :^) >=== -prose edit
Edited last time by Chobitsu on 08/24/2023 (Thu) 14:02:31.
>>25020 First talk is a skip
>>25020 Thanks Anon, I'll check this year's event out at some point soon.
Dconf has ended, I have commented the time stamps for each talk (except day 1, someone else did that). For people who use the SponsorBlock addon, I have submitted the Intermissions. https://www.youtube.com/watch?v=uzuKqiFVNZM https://www.youtube.com/watch?v=wXTlafzlJVY https://www.youtube.com/watch?v=CMrb6ZWhqXs For more information on the talks the schedule is here https://dconf.org/2023/
>>25102 Excellent. Thanks for all the hard work, Twilight. So, one of the language's important principals is closely associated with Ucora. Can you give me any further insights on them beyond link-related, Anon? https://www.ucora.com/about/
>>25103 Here is the forum post that announced their involvement with the D Foundation; that should hopefully give you an idea of what the relationship is and what they are doing for the foundation. https://forum.dlang.org/post/avvmlvjmvdniwwxemcqu@forum.dlang.org Hope this answers your question.
>>25126 Thanks for the link Anon, I'll peruse it.
So I learned something different that could be very significant for robust, error-free waifus. I'm looking at the Ada language for kicks. I started following links and found something I didn't know: AdaCore, which is connected with Adafruit, a company that makes lots of different single-board computers and electronics parts, is really into Ada. Especially the SPARK programming language. SPARK is a subset of Ada, trimmed down to make it even safer. The idea is that you have to declare everything, and the compiler catches most every mistake, to leverage out bugs.

Hackaday has an article on this. Some say it's BS, but a lot say that SPARK, and Ada, really crush the likelihood of bugs. There's no doubt that lots of things that just have to work (space shuttles, space stations, the F-22, the F-15, and lots of medical equipment) use Ada or Ada SPARK and have strong protections against surprise bugs.
https://hackaday.com/2019/09/10/why-ada-is-the-language-you-want-to-be-programming-your-systems-with/
I found a video where Nvidia says they are going to move to all SPARK programming, because so many of their chips are used in task-critical areas like self-driving.
https://www.youtube.com/watch?v=2YoPoNx3L5E
Here's a link on learning Ada and SPARK:
https://learn.adacore.com/
The link below lists a huge number of mission-critical items that used SPARK.
https://en.wikipedia.org/wiki/SPARK_(programming_language)
While SPARK and Ada may not be glamorous or crafty like LISP, they often work exactly as programmed, without surprises, the first time. Worth thinking about. I wonder: are there tools that will accomplish the same thing with C++ or other languages, without using Ada or SPARK?
https://www.hackster.io/adacore
>>25214 You know I think a lot of you Grommet, so please don't take this personally, but > Adafruit They are a great iconization of Leftists in general, and will be no friends to /robowaifu/ and our ilk once the 'rubber meets the road', so to speak. The name itself should be a big tipoff. Simply research the founding/history of the organization, Anon. Use them; that's fine as long as you take everything related to them with a big grain of salt. I'd personally much prefer Newark or Black Box for supply of electronics, etc. Not only are they much more well-established in the engineering domains, they are much less likely to be completely overrun by Filthy Commies. :^) > Ada I've mentioned the benefits of Ada numerous times here on the board; also why I feel it is primarily-disqualified for us. The hardware onboard aircraft doesn't 'magically' get safety just because Ada is being used there. It still requires strict engineering discipline and rigorous testing (at huge costs: Ada is a very expensive language to use in general). And no system is 100% safe. > I wonder are there tools that will accomplish the same thing with C++ Both D and C++ can be written in highly safe ways. However D has other challenges for us here, so I can't recommend it (at least yet). C++ has a very strong impetus for a pared-down, sensible/sane usage of the language known as the C++ Core Guidelines [1], which I intend to teach some of the most important basics of in our secondary classes here (including automated tools for checking CPPCG compliance). 1. https://isocpp.github.io/CppCoreGuidelines/CppCoreGuidelines >=== -prose edit
Edited last time by Chobitsu on 09/05/2023 (Tue) 13:45:46.
>>25215
>don't take this personally
No. I had heard of them ("...a great iconization of Leftists...") but actually came at them through Ada. I found that the Adafruit people somehow have some sort of affiliation, however tenuous, with Ada, which I didn't know about. They link to each other.

The people that did Ada, I think, made it too much by committee. The Ada fanatics say the newer versions have fixed a lot of this. They have further fixed this, I believe, by paring it down to a subset called SPARK. Probably the most retarded name for software ever. How many things are called Spark? Way too many. When you say Ada is "expensive", I think of it totally differently. It's well thought out, takes a whole-system approach to things, and is not the hacked-up abomination that C is. (I've been reading the "Unix Haters Handbook" again.)

If these people had real smarts, they would make a good higher-level SPARK program, like they have, then they would get some sort of miracle C/C++ hacker and make an extensive C/C++ program that would test all the basics of the registers and all the other functions of a processor or microprocessor (in C/C++), and then just read what it compiled to. Then they could equate the compiled machine-code results to SPARK commands and back-compile a compiler for every processor they could find, while leaving the high-level SPARK tools and programming syntax in place. There aren't enough compilers for SPARK for all the processors, and making them is not easy. Why they don't use computers to do the things they are supposed to (drudgery work) and automate it, I have no idea. Likely because the Ada people hate C and the C people hate Ada, so nothing good ever happens. Worse is better.

I fully understand, I believe, why you like C/C++ so much. It gives you complete control, but unless you have years of experience it can cause all sorts of grief. Stuff like SPARK and Ada is essentially boring, as it does things the right way. Ever heard of (I know you have) https://en.wikipedia.org/wiki/Worse_is_better ? That's what C/C++ is, worse. Let's note that while C/C++ might be easy to whip something up in, all the time you spent hitting all the gotchas and memorizing all the gotchas, you could have spent a little more time with something like Ada and gotten it right in the first place. C/C++ only seem easier after spending years hacking away at them. Stuff like Ada may take longer to set up, but only because you have to get it right or it just bleeps at you and tells you you fucked up. C will take whatever you throw at it and promptly throw up bits on your keyboard.
>>25219
>conflating the ultra-pozz den Adafruit, with the 40+yo US DARPA/DoD overarching, sprawling, mandated, programming language (rapidly 'replaced' by the MIC industry using C & C++ instead, b/c of Ada's intractability fail -- especially during the first couple decades of the enforced debacle).
<shiggy
>I fully, I believe, understand why you like C/C++ so much.
I don't 'like' C++ except insofar as it is literally the #1 most-likely means for /robowaifu/ 's successful & complete, systems software solutions; to give all of us entirely-unencumbered, practical, realworld robowaifus.
>That's what C/C++ is, worse.
With all due respect Anon, this is the literal last time I'm having this debate with you. Kiwi's right, you keep rehashing topics repeatedly (cf., search your own use of the term 'worse': ITT). By all means Anon, knock yourself out. You've expressed an interest in learning how to program software. Go ahead and tackle Ada as your language of choice. GCC offers a great front-end for the language via their amazing compiler system. [1] After enough experience, you too will understand why so few engineers willingly adopt the language on a personal level in actual production systems, and why it is literally one of the most expensive approaches to software development known today.
>tl;dr
Let's give any further programming language discussions between us a miss, friend Grommet. If you'd like to participate in our own /robowaifu/ programming classes here, and as long as you stay on-topic (ie, C++ programming, etc.), then of course that's an exception to this mandate regarding yourself. Cheers Anon, and kind regards. :^)
1. https://gcc.gnu.org/wiki/GNAT
>=== -prose edit
Edited last time by Chobitsu on 09/07/2023 (Thu) 18:03:07.
>>25243 >With all due respect Anon, this is the literal last time I'm having this debate with you. Kiwi's right, you keep rehashing topics repeatedly Fair enough and a fair criticism. I'll try to take it to heart. I do make mistakes and also try to admit it when I do. My apologies.
I was looking into Elixir (lang) for creating my implementation of the cognitive architecture. That said, I keep an open mind and found this here for C++: https://github.com/endurox-dev/endurox https://en.wikipedia.org/wiki/Enduro/X >Enduro/X is an open-source middleware platform for distributed transaction processing. It is built on proven APIs such as X/Open group's XATMI and XA. The platform is designed for building real-time microservices based applications with a clusterization option. Enduro/X functions as an extended drop-in replacement for Oracle Tuxedo. The platform uses in-memory POSIX Kernel queues which insures high interprocess communication throughput. > ... AGPL It also has Python bindings. Hmm. Anyone worked with this?
>>25219 >>25214 I'm a little late (I don't have a lot of time to be checking robowaifu), but I think I can say something of value for Grommet. While good languages can have fewer foot guns and be designed to make mistakes harder, a language should not be chosen "for being safer" or for any other memed purpose. Are you picking Ada because it solves problems for you, or are you picking it because it's "safe and runs on my heckin airplanes TM"?
>but unless you have years of experience it can cause all sorts of grief.
This is true; that's why I advise starting now and start clocking in hours, gaining experience and confidence. Even if the language stops memory bugs, there are so many other mistakes you can and will make. You can write good code in almost any language. Just pick a language and start writing, try to learn the language as well as you can, and don't be stuck in analysis paralysis. I guess there is some irony in me saying this in a thread called "Selecting a Programming Language", but I think it's the most helpful thing to say.

If you want safety, I recommend you take a proven approach: test everything! You need 100% coverage, and the tests need to be good and not overfitted. SQLite is a perfect example of this. It's very reliable, runs on le hecking airplanes & is bug-free while being written in pure C. It's not a stagnant codebase of untouchable C; because the tests are good, they can afford to do large rewrites even between minor releases.
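For anyone who hasn't done this before, "test everything" can start as small as a few pytest functions sitting next to your code. The clamp() helper below is a hypothetical stand-in for whatever waifu code you are actually writing, just to show the shape of it; install pytest and run `pytest` in the same directory.

```python
# clamp() is a made-up example function; the point is the tests around it.
import pytest

def clamp(value, low, high):
    """Clamp a joint command into its safe range."""
    if low > high:
        raise ValueError("low must not exceed high")
    return max(low, min(high, value))


def test_clamp_inside_range():
    assert clamp(5, 0, 10) == 5

def test_clamp_at_limits():
    assert clamp(-3, 0, 10) == 0
    assert clamp(42, 0, 10) == 10

def test_clamp_rejects_bad_range():
    with pytest.raises(ValueError):
        clamp(1, 10, 0)
```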
I played around with a rather niche language today: Lobster. It has a Pythonic syntax and some ideas similar to Rust in regards to borrowing, so it might be useful for doing something that needs to be secure but easy to write. It's also quite fast and can compile to C++ as well; that code can then be used for WASM (binaries in the browser). The language was mainly meant for creating games, so it has a strong focus on graphics (OpenGL). I'm experimenting in a direction towards using the frames of a video made with "SadTalker", like I posted here >>26029, to have a fast way to make an avatar talk without rendering it with an AI model every time. Imagine a small program that takes in a stream of text and creates speech with the right movements of the lips, not looking human-level "realistic" but reasonably good and without any relevant lag. So far I managed to make it load images and call the text-to-speech service on Linux. Both work very fast. So fast that I can also overlay more than one of those, so this might help with making it smooth. To make this work at some point, I will need to make a picture sequence (frames) for each syllable or something similar, and probably also for some combinations of how the head is positioned. Then the program would load those based on the text input, while also creating text-to-speech output. This can help to investigate animated virtual girlfriends as a predecessor of robowaifus. I also imagine this being useful for AI creating simple simulations. I don't know how to use that yet, but I have some ideas, and here's a website that inspired me: https://concepts.jtoy.net - I think about it this way: training a model to recognize a pattern in a video input (her own vision, but maybe also watching TV or a game) and match what is going on with one of those concepts.
https://github.com/aardappel/lobster
https://aardappel.github.io/lobster/builtin_functions_reference.html
https://github.com/OpenTalker/SadTalker/tree/main
>>26092 Neat! This seems like a really promising investigation Anon. I certainly agree that Visual Waifu development, if done well, will probably clean about 50% of the tasks across the board from our table toward making great robowaifus. Please keep us up to date, NoidoDev. Cheers. :^)
>>26093 I'm struggling a bit with the Lobster language, since I'm not used to compiled languages and it is not very well documented. I tried the same thing in Python with help from ChatGPT. Dev time was way shorter, only like half an hour, though it takes much longer to start the program. Now I'm looking into compressing the frames somehow by removing the areas that two consecutive images have in common. The frames are circa 80x the size of a short video (450 MB, at 256 colors).
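One hedged way to do that frame compression in Python, using Pillow and NumPy: keep the first frame whole and, for every following frame, store only the pixels that changed beyond a small threshold. File names, the threshold, and the storage format here are all placeholder choices, and the frames are assumed to share one resolution; this is a sketch of the idea, not a tested tool.

```python
# Store a key frame plus per-frame lists of changed pixels, then rebuild.
import numpy as np
from PIL import Image

def load(path):
    # int16 so the subtraction below cannot wrap around like uint8 would.
    return np.asarray(Image.open(path).convert("RGB")).astype(np.int16)

def make_deltas(paths, threshold=8):
    """Return the first frame whole, plus changed-pixel lists for the rest."""
    key = load(paths[0])
    prev = key
    deltas = []
    for path in paths[1:]:
        frame = load(path)
        changed = np.abs(frame - prev).max(axis=2) > threshold
        ys, xs = np.nonzero(changed)
        deltas.append((ys, xs, frame[ys, xs].astype(np.uint8)))
        prev = frame
    return key.astype(np.uint8), deltas

def replay(key, deltas):
    """Rebuild every frame from the key frame plus the stored deltas."""
    frame = key.copy()
    yield frame.copy()
    for ys, xs, values in deltas:
        frame[ys, xs] = values
        yield frame.copy()
```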
This is VERY COOL. A "Scratch"-like visual programming environment for microcontrollers, ESP32 included. It has a multitasking OS using byte code; the OS is 16k. It can run in a browser or as a downloaded program, and can be used to test programs in a virtual microcontroller on the web or, I think, in the downloaded program. http://microblocks.fun/ I've been reading a lot of Alan Kay's stuff, so this makes sense.
>>26245 Okay. Not sure how much one could do with that and if it was suitable for doing something related to robowaifus, but it might come in handy.
>>26245 Neat! Thanks Grommet (good to see you BTW). I love Scratch, and think its kind of visual interface will be highly valuable as a scripting interface for us, once robowaifus have become commonplace enough for Joe Sixpacks to be clamoring for their own.
Python Sucks And I LOVE It | Prime Reacts
https://www.youtu.be/8D7FZoQ-z20
tl;dw: Execution speed is only one thing; development speed often matters more, and getting things done matters even more. The builtin types are in C anyways, and parts of the code can be transferred to Cython. Not mentioned: Mojo is around the corner, and based on Python.
Edited last time by Chobitsu on 11/08/2023 (Wed) 03:42:12.
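A quick way to see the "builtins are in C anyway" point for yourself: time a hand-written Python loop against the C-implemented sum() builtin. The numbers below will differ per machine; this is only meant to show the relative gap the video talks about.

```python
# Summing a list with a Python-level loop vs the C-implemented sum() builtin.
import timeit

data = list(range(100_000))

def manual_sum(xs):
    total = 0
    for x in xs:
        total += x
    return total

print("python loop:", timeit.timeit(lambda: manual_sum(data), number=100))
print("builtin sum:", timeit.timeit(lambda: sum(data), number=100))
```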
>>26260 >Mojo is around the corner, and based on Python. Yeah, I'm curious if Modular will ever decide to end their commercial goals surrounding Mojo, and release it free as in speech to the world with no strings attached. It would be a shame for anons to get mired into some kind of Globohomo-esque tarbaby trap with something as vital to our robowaifus as her operational software. >=== -prose edit
Edited last time by Chobitsu on 11/08/2023 (Wed) 03:55:26.
Mojo was mentioned, but no mention of Julia? I haven't had experience in either, but Julia seems like it might be better for the people here who already know Python.
https://www.datacamp.com/blog/julia-vs-python-which-to-learn
https://exploreaiworld.com/mojo-vs-julia
However, when I check on some people doing benchmarks, it's not uncommon for Mojo to be faster, perhaps because it takes more effort to optimize Julia code, or because they just had a flawed method of comparison, though it is possible Mojo really is faster. However, Mojo isn't even open source currently, as far as I'm aware, and may never actually be, while Julia has always been open source and has been around a while now. In either case, both are faster than Python when dealing with large datasets, as far as I can tell. Julia can call C, C++, Python, and Fortran libraries. Am I mistaken, or isn't this likely better than Mojo and Python? Python being popular doesn't mean it's the best; it's just a general-purpose language with a simple syntax, so a lot of programmers know it. But people in ML research have been adopting Julia at an increasing rate recently, although it's not anywhere near as popular even there just yet.
Great, newbie-friendly introduction to ASM programming [1] by one of the self-taught masters, Matt Godbolt; creator (from scratch!) of the world-class programming analysis tool, Compiler Explorer [2]. Highly recommended video for anyone trying to understand what assembler/machine-code programming is. Good investment of 20-minutes for robowaifuists, Anon! :^) >note: Also the short, old book he mentions at the beginning of the video is available on Wayback. [3] --- 1. https://www.youtube.com/watch?v=8VsiYWW9r48 2. https://godbolt.org/ 3. https://archive.org/details/machine-code-for-beginners >=== -minor edit -add 'Wayback' hotlink
Edited last time by Chobitsu on 04/09/2024 (Tue) 03:00:28.
>>30828 Thanks, I'll watch this when I also have time to play the "Human Resource Machine" which is a learning game for that.
