/robowaifu/ - DIY Robot Wives

Advancing robotics to a point where anime catgrill meidos in tiny miniskirts are a reality.



Selecting a Programming Language Robowaifu Technician 09/11/2019 (Wed) 13:07:45 No.128
What programming language would suit us and our waifus best? For those of us with limited experience programming, it's a daunting question.
Would a language with a rigid structure be best?
Do we want an object-oriented language?
How much do you care about whether or not a given language is commonly used and widespread?
What the fuck does all that terminology mean?
Is LISP just a meme, or will it save us all?

In this thread, we will discuss these questions and more so those of us who aren't already settled into a language can find our way.
>>23474 I tried play.sound(output) and it gave me the following error:
>>23475 [ALSOFT] (EE) Failed to connect PipeWire event context (errno: 112) I don't know why it posted by itself. Anyways that's why I added a bash script...
>>23476
>[ALSOFT] (EE) Failed to connect PipeWire event context (errno: 112)
I'd suggest you a) switch to Mint or Ubuntu first, and b) search for that exact error message (in quotes) if it still happens. Installing a new machine shouldn't take you more than about 15-20 minutes on reasonably new hardware, so it's a low investment of time to go that route. Good luck!
>>23477 I just used another terminal and it worked B)
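The playback failures in the exchange above look transient (the sound daemon connection dropping and coming back), and a small retry wrapper is one common way to paper over that instead of switching terminals. A minimal sketch in Python, assuming your playback command is something like `aplay output.wav` (the command name here is just a placeholder for whatever player you actually use):

```python
import subprocess
import time

def run_with_retry(cmd, attempts=3, delay=1.0):
    """Run a command, retrying a few times on nonzero exit.

    Meant for transient failures such as a sound daemon that is
    briefly unreachable (like the PipeWire errno 112 error above).
    """
    last = None
    for _ in range(attempts):
        last = subprocess.run(cmd, capture_output=True, text=True)
        if last.returncode == 0:
            return last
        time.sleep(delay)  # give the daemon a moment to recover
    raise RuntimeError(
        f"command failed after {attempts} attempts: {last.stderr.strip()}"
    )
```

Usage would be something like `run_with_retry(["aplay", "output.wav"])`; if playback still fails after all attempts, you at least get the captured stderr in the exception instead of a silent failure.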
>>23461
>Chris Lattner: Future of Programming and AI | Lex Fridman Podcast
Eyy, that podcast is what I wanted to recommend: https://youtu.be/pdJQ8iVTwj8 He's involved in Mojo (the language) and another platform to make AI development faster:
>modular
>The Modular Engine unifies AI frameworks and hardware and delivers unparalleled performance and cost savings.
https://www.modular.com
>>23486 how can a language even have anything related to performance
>>23479 GG
>>23486
>The Modular Engine unifies AI frameworks and hardware and delivers unparalleled performance and cost savings.
Haha reads like ad-copy, Anon. :^)
>>23487
In a lot of different ways. Generally, the closer the language 'sits' to the actual hardware itself (that is, the CPU, its memory subsystems, etc.), the higher the performance. C & C++ are the two general-purpose languages that have excelled at this long-term (for about 50y & 40y, respectively).** ASM (machine-specific assembler) is really about the only practical language at a lower level than these two. The tradeoff here (all engineering is a series of tradeoffs) is complexity. As developers, the closer you are to the hardware, the fewer 'nets' you have to protect you (and you have more freedoms, too) -- you must manage many things directly yourself. But systems devs don't generally want many of those nets anyway, because they 'bloat & slow' the system. C++ & C (with a little bit of ASM) are our preferred systems languages for creating IRL robowaifus, in that order. These choices are the ones most directly related to a language's performance & stability.
---
** Side-note: they are also both ISO international standards (a form of sovereign treaty), so they are extremely stable (a yuge win for /robowaifu/, et al).
>===
-prose, fmt edit
Edited last time by Chobitsu on 06/25/2023 (Sun) 21:13:35.
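The "closer to the hardware, the faster" point above can be seen even without leaving Python: a loop executed by the bytecode interpreter pays dispatch overhead on every iteration, while the built-in `sum` runs the same loop in C inside the interpreter. A rough timing sketch (exact ratios vary by machine and interpreter, so treat the comparison as illustrative only):

```python
import time

def py_sum(xs):
    # Interpreted loop: each iteration pays bytecode-dispatch overhead.
    total = 0
    for x in xs:
        total += x
    return total

data = list(range(1_000_000))

t0 = time.perf_counter()
r_py = py_sum(data)
t_py = time.perf_counter() - t0

t0 = time.perf_counter()
r_c = sum(data)  # built-in sum: the loop itself runs in C
t_c = time.perf_counter() - t0

assert r_py == r_c
# On CPython, t_c is typically several times smaller than t_py --
# the same effect, in miniature, as moving a hot loop from a
# high-level language down to a systems language.
```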
>>23491 >Haha reads like ad-copy, Anon. It does buty...this guyhas major sills. He has doe some really heavy lifting. Clang compiler, swift language...and every ne of these he has seen what can be done, what can't and what they didn't do. I personally think python is icky but, people way smarter then I disagree. Not difficult to find them. My understanding is he is also rewriting the python C compiler, or tuning it up. The speed increase should be high. I linked a guy the other day that has been programming for 40 years and he was, and is, a huge fan of Rebol. He is now using "Python Anvil framework". He says while it's huge it does everything extrmely fast. He devolps for people online and makes his GUI's i abrowser. He says Anvil just breezes through everything.He can in real time change aspects of his clients through a browser or give them several choices. He says they love it and his work is super fast. ne of the keys he says is python has these massive libraries so he can stitch together stuff super fast for anything he can thing of there's likely a library for it. He's also likes the browser "metro" interface elements and uses it a lot. http://www.rebolforum.com/index.cgi?f=printtopic&topicnumber=46&archiveflag=new It's like JavaScript. Not the best but it's everywhere. I suspect python with Mojo will do the same as you will be able to use all this python libraries and then tie that into AI code AND AI specialized hardware with Mojo. @ >>23487 If you watch the video I linked @ >>23461 You will see he talks about exactly that subject.
>>23507 >It does buty...this guyhas major sills. He has doe ARRRGH. My typing so poor. Apologies. I forget that little ass window and sometimes forget to scroll up and check.
>>23507 Still just a language. If it's just for AI, then with AI it's not so much a language problem; it's more that GPU makers don't publish their ISAs the way CPU makers do. (The use of GPUs at all is purely ad hoc: they're only used because a GPU is hundreds of small CPUs in parallel that execute in waves, which is ideal for graphics processing and just so happens to overlap with neural nets. It's still not designed for this, no matter the marketing.) So ultimately it doesn't matter what language you use: as long as no one can make compilers for GPUs, every language has to go through the vendor's shoddy driver API, and it doesn't really matter how fast you call the API when no one wants to use an API to begin with. The closest you can get is SPIR-V, which is still just an API in disguise. Either a hardware manufacturer shows up that doesn't have a stick up their ass and makes SIMD processors similar to a GPU -- not designed for graphics behind a shoddy API, just general SIMD computation -- or they all finally agree on a standardized ISA (not an API like SPIR-V/OpenCL), like what happened with CPUs and x86. Although I think that only happened because of lawsuits, not the greater good.
>>23507
>He has done some really heavy lifting.
No, I totally get that, Grommet. And I don't really think I have a case of either arrogance, nor foolishness in this situation either. With humility I suggest it's quite the opposite, in fact. We are all in the middle of a yuge sociological warfare that evil forces have arrayed against all virtuous manhood (males specifically), to destroy both us and indeed the civilization we have built, that the entire world rests upon. These may seem like grandiose terms -- until you simply look around you. The Globohomo Big-Tech Government is very clearly engaged in a methodical attempt to eradicate the White race, followed by men in general. Their machinations are the sole reason that /robowaifu/ even exists in the first place; to fight against the degenerate effects of the Globohomo's literal favorite pet: feminism. They have already managed to do great destructive evil with it, as every regular here is quite well-aware. Remember how I said "...but I certainly applaud any non-pozzed, systems-oriented languages" (>>23470) before? Well, feminism & all the other -isms that immediately follow on from it are all part of this exact same phenomenon going on within the software & technology fields generally. Why do you think we here on the Internets call G*thub SJWhub? Or its loud, politically-correct proponents CoC-suckers? Tech is clearly a big weapon in the hands of the GH and its all-too-willing golems. They will cancel you out, and do everything they can to hinder or stop you, if you refuse to toe their line of affirming all their usual suspects' usual, evil bullsh*te. But thankfully, by the grace of God, we have some software tools that are largely immune to their shenanigans & meddling. The two most important ones being -- you guessed it -- C & C++.
This is due to the fact that before the GH made its big lockdown moves post-9/11, post-Gamergate, post-Current Year, these two very powerful languages were already cast under the auspices of the ISO, and had already been adopted as international standards. This means that the countries themselves must cooperate together to maintain the status quo with these language definitions, and that no single GH entity (like, say, A*ple or G*ogle or the Wh*tehouse) can alone pull their cancel-culture plug on any groups """improperly""" creating powerful anti-feminist systems like robowaifus, """abusing""" these programming languages & related software technology, etc. This wonderful situation for C & C++ is markedly different than, say, R*st, where -- let's be honest -- 'CoC-sucking' is their middle name haha. :^)
>tl;dr
As incredibly politically-incorrect as we all are here, we simply cannot put ourselves at the """tender""" mercies of these demonic organizations that are attempting to abscond with the entire technological basis of the civilization around us. We can't.
>ttl;dr
Don't drink their koolaid, bro. Remember who is in back of, and driving, these agendas. They are no friends of humanity.
>===
-prose edit
Edited last time by Chobitsu on 08/05/2023 (Sat) 08:47:20.
>>23513
>I don't really think I have a case of either arrogance, nor foolishness
If you read it as my accusing you of this, then I didn't word it right, because I in no way meant that. I'm fully, really fully, on board with all you said. The whole entire manufactured culture we have is vile and evil. I'm not always so good at expressing what I mean, though in my defense most of these things are complicated. When I keep pushing other languages, it's mostly not because I don't believe in C and C++'s capabilities. It's because I do not think MY capabilities in such difficult languages are up to the task. I expect that I'm not the only one in this class, so I try to find ways around this. I comment on what I find that can possibly make this task easier. I do believe that there are easier ways, and better languages, that can do the same thing C++ does with small performance hits that, on modern hardware, amount to not so much. There's always the basic "the enemy of the best is the good enough". I'm looking for good enough in some places and the best in others (my hatred of seams in skin suits). Now, the video I mentioned here >>23461 and linked by NoidoDev here >>23486 -- I've been watching this. It's so deep, for me, that I can't watch it all at once and have to back up several times. Now maybe he's full of it, but if he can pull this off it will be very beneficial. One thing I really like is that he is doing all the low-level work. He is expressly addressing things that I worry about. He says that he is looking at the low level, the physics of how processors work, to get the best speed; then he is building a framework on top of that, and he is specific, for the thousands of different types of processors -- graphics, normal, and specialty. Then, by making this framework and laying Python over it, you can stay in the higher-level functions and still get high performance. Yes, there are some additions.
The gist I've got of these so far is typing, and setting up the ownership of various variables instead of Python's allowing this to go unspecified (for speed; there is more, but I don't understand it yet). However, his goal is that if you do not wish for max performance, it works like regular Python. He's talking a year before they have a hard-core package, but some of it works now. Now, for you this may be no big deal, but for most, who are not C++ programmers, being able to string together libraries to do tasks is super great stuff. This has real-world consequences. I had an idea: what does it cost to increase power? I linked a chart of processor power, in general, and cost from Tom's Hardware (well, this failed, so you have to look it up yourself). Let's say you use the lowest cost, $99, to get a waifu walking around. But then you want speech and maybe a few other functions. Doubling the power costs you about $300 more. Now, what is the cost of the time to write all this code in C++, compared to stringing together some Python libraries and using this guy's Mojo to speed it up? When you start comparing the difference in time spent on this to real dollars, higher-level languages start making a good deal of sense. And since processor power keeps going up and cost keeps going down, upgrades -- using higher-level languages to upgrade without all this bit-twiddling -- become even more attractive. Now, Moore's law may have slowed down in density of transistors, but I don't think it has slowed much on overall increases in power. They are parallelizing things, which is fine for our task. And even the so-called "end" of Moore's law is premature according to Jim Keller, who is not some guy living under a bridge. I've seen talks where he lays out plans for a path to 50x gate density.
Well, in this case the fact that it's made in Python does matter, because it's pretty slow -- although I'm not sure if it's the programming language or the API. I'm going to redo it in bash.
>>23524
>I've been watching this. It's so deep, for me, I can't watch it all at once and have to back up several times.
I'm downloading a lot of such talks as audio, then using them like (audio) podcasts. The downside of this is that making notes is hard while walking around, and a bit of the information, in the form of facial expressions and such, is lost. Generally it's a good thing, though, if one has to wait somewhere or wants to walk around for some hours. Just make really sure not to walk into a car if you try the same. You don't need to understand everything at once; you can replay it later. I also recommend podcasts for programming beginners. I can't give you a good hint here, since mine was in German (Chaosradio). I think I listened to English ones as well at some point, but don't remember a specific name. The "Talk Python to Me" podcast might also explain some details, but it's mostly about specific libraries. It can still help to listen to it even if you won't use the library they talk about, since they explain why it's necessary and talk about Python in general.
>>23525 No, yeah, it was the API. AssemblyAI is too slow, and AWS Transcribe requires that the file be uploaded to a bucket, so I guess that leaves Google transcribe.
>>23528 I'm using AWS Transcribe, since Google won't take my card. AWS demands that you upload the file to an S3 bucket, and then outputs the result as JSON. Really, this has been exhausting...
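For what it's worth, the JSON that Transcribe writes can be unpacked with nothing but the standard library. The sketch below assumes the commonly documented layout (`results` -> `transcripts` -> a list of objects with a `transcript` key); verify it against your own output file, since the exact structure is AWS's to change:

```python
import json

def extract_transcript(path):
    """Pull the plain transcript text out of an AWS Transcribe result file.

    Assumes the output JSON nests the text under
    results -> transcripts -> [ { "transcript": ... } ];
    check this against your actual file.
    """
    with open(path, encoding="utf-8") as f:
        doc = json.load(f)
    parts = doc["results"]["transcripts"]
    return " ".join(p["transcript"] for p in parts)
```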
>>23528
>Assemblyai
Garbage website, which doesn't allow me to register. It just doesn't, without an error message.
>>23530
>aws transcribe
Did anyone here try to use Whisper? Or are your GPUs so weak that you can't? Though I think the small models run on an SBC. That said, if this isn't about programming languages, then we're OT here. We have a thread on speech generation, which is unofficially also about speech recognition: >>199 or the thread on NLP >>77 which might be the best suited for the topic of speech recognition.
>>23533 fuck off redditor. You're the problem
Open file (168.28 KB 849x458 26251282793756.gif)
>>23534 sirs this is a language thread do 1 thing and move thread do the needful thenks
>>23533 It's not a resource issue as much as a matter of making it more straightforward. I think AWS Transcribe and Polly are more straightforward than running it locally, and I think people who will want to talk to the waifubot would be better off using that either way. Keep in mind the waifu will do most of its computing wirelessly on the computer, since I'm not going to try to fit a GPU on it. Never mind the AssemblyAI though; it's in the Python code, but I ended up ditching it for AWS Transcribe in the bash code. Consider the Python one abandoned.
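The "compute on the host machine, not on the bot" split above only needs a very small wire protocol. The sketch below is an invented, minimal length-prefixed TCP exchange (nothing AWS- or Whisper-specific): the bot sends raw bytes, the host replies with the "transcript". The host-side inference is stubbed out with upper-casing, just to show the plumbing:

```python
import socket
import struct
import threading

# Minimal framing: 4-byte big-endian length prefix, then the payload.
# This protocol is illustrative only.

def send_msg(sock, data: bytes):
    sock.sendall(struct.pack(">I", len(data)) + data)

def _recv_exact(sock, n):
    buf = b""
    while len(buf) < n:
        chunk = sock.recv(n - len(buf))
        if not chunk:
            raise ConnectionError("peer closed early")
        buf += chunk
    return buf

def recv_msg(sock) -> bytes:
    (length,) = struct.unpack(">I", _recv_exact(sock, 4))
    return _recv_exact(sock, length)

def serve_once(host="127.0.0.1"):
    """Accept one connection, 'transcribe' the payload, reply, exit.

    The upper-casing here is a stand-in for real inference on the host.
    Returns the port the server is listening on.
    """
    srv = socket.socket()
    srv.bind((host, 0))  # port 0: let the OS pick a free port
    srv.listen(1)
    port = srv.getsockname()[1]

    def _run():
        conn, _ = srv.accept()
        with conn:
            audio = recv_msg(conn)
            send_msg(conn, audio.upper())  # stub for the speech model
        srv.close()

    threading.Thread(target=_run, daemon=True).start()
    return port

def ask_host(port, payload: bytes) -> bytes:
    with socket.create_connection(("127.0.0.1", port)) as c:
        send_msg(c, payload)
        return recv_msg(c)
```

A real deployment would replace the stub with a call into the speech model and add authentication, but the framing layer stays the same.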
>>23534 Point taken, but if you're posting nothing more substantial than an ad hominem then please at least wait to do so for a post with the 'evidence' directly to hand in it. Less confusing that way. TIA Anon. :^) >>23536 >that ad though Lol. They just don't make the classics like that any more. Please to where I can log into Mr. NagoorBabu's important classes immediately? 1 free Internets to whoever can adapt something close to that into a robowaifu-centric banner gif for us (>>252). :DDD
>>23540 I don't think that's funny. I'm out of here.
>see you tomorrow
No, really, I think I've had enough of this nonsense.
>>23541 Okay, I don't want to be too rash, but really, if you could tone down the white supremacist thing I'd appreciate it. Because really, this has nothing to do with it.
>>23550 If you're referring to me Anon, in no way am I espousing a 'white supremacist thing', I'm simply stating the actual situation in the real world. Neither will I 'tone it down', if by that you actually mean 'don't ever mention that again'. We're 'in favor' of all men here (even our obvious enemies like those working for the GH). Their gonads are what's important in this matter, not their heritage. :^) Clearly you're angry, and while I get that, you're not really contributing much to the welfare of the little community here by your regular REE'g at others. Maybe you can tone down the complaining Anon? TIA.
>>23551 Really, there is a line between having thin skin and putting up with a constant barrage of polfaggotry. But okay.
>>23552 Lol. I frankly consider your skin to be the thin one. I don't care if you're racist or not -- it's completely irrelevant to me. I only care whether you're acting as a destructive agent against /robowaifu/ and its regulars. A good way to not be destructive is to, and I quote, "Don't be a dick". You're being a dick, Anon. :^)
>Conduct on /robowaifu/
>-First, the two basic rules of the board
>> 1 : Spoiler NSFW content.
>> 2 : Don't promote either feminism or literal faggotry outside of The Basement Lounge.
>https://alogs.space/robowaifu/rules.html
>-Second, and simply; don't be a dick.
>> "And as you wish that others would do to you, do so to them."
>> -t. Jesus Christ
>https://biblehub.com/luke/6-31.htm
(>>3)
>>23552 To me the issue is trolling, and that includes nonsense like attacking people for using Reddit and Discord, or creating OT debates. We had this issue a while ago, until we got a volunteer, who then started to ban everyone he didn't like or who in his mind shouldn't be here -- which turned out to be more or less everyone. The poster in >>23550 didn't state what his problem was. If it was "raycist jokes", then that's the weakest argument, and we don't know who he is, so who cares. This board has its roots on 4chan and later 8chan. It's better to have some jokes which some people find "racist" than to have the people offended by that taking over. That said, >>23536 was directed at me, and I'm not Indian nor programming in Java, though I may pick up Kotlin or Clojure one day. So aside from the picture being funny, it's pretty stupid. I kinda ignored this, but I'm suspicious that the people complaining about it may be the same person, or of the same group, trying to disrupt our conversations here with shitpostings and flame wars (between sock puppets if necessary). This even includes bringing this shitty thread to the top again and again... though others might do it without bad intentions. The latest topic here was Whisper and people using online services for speech, which is OT in this thread. Using online services for speech recognition is also generally discouraged beyond prototyping, which at least should be mentioned. Before that, it was once again a discussion about C++ vs Python. This thread is not the best one; I'm considering hiding it again, so that I don't have to see it.
>>23553
>I only care whether you're acting as a destructive agent against /robowaifu/ and its regulars.
This. But you might be giving it too much leeway already.
>>23557
>But you might be giving it too much leeway already.
Trust me, I understand your view, NoidoDev. You mentioned the debacle before. I waited long on that in hopes of reconciliation, as it's the right thing to do(tm). :^) That's just generally my moderation style, and even when I do act, it's usually just to quarantine the posts in question (if they have any redeeming qualities at all, such as humor :^).
Open file (14.29 KB 512x512 dman_ai_1.png)
Open file (57.32 KB 673x496 pyd.png)
I don't see my favorite language here, so let me introduce a great programming language that most people do not know about: the D programming language. First I'll give the buzzword-filled blurb that most languages do: D is a safe, statically typed, multi-paradigm systems language with a C-like syntax. Why do I think this language is a good fit here? It's a very productive language, especially for a single developer. One undervalued aspect is that it's not a "big agenda" language focused on one design aspect at the price of all others; instead it focuses on being generally productive. Having come from C++, the super fast compile times and saner design are such a breath of fresh air; D is to C++ what Go is to C. But simply describing D as a nicer C++ is not doing it justice; it has aspects that make it stand on its own. D has the best template system, compile-time introspection, compile-time function execution, and code generation I have experienced in any language. It's trivial to take an input and generate D code at compile time; there is no need for complicated build scripts that generate source files as separate steps, or any nasty text preprocessors. Using a library called pyd you can get nice Python & D interop; this is one of the reasons why I think D is a good fit for robowaifu. Another awesome feature is the GC. I know C & C++ people will roll their eyes, but I'd argue it's actually really good for being productive. Unlike a lot of languages with a GC, it's not forced -- using it is optional, and C & C++ style manual memory management is a valid approach. D provides the @nogc attribute to ensure at compile time that a function will never trigger the GC. Anyone writing high-performance code should already be profiling, and if you're inside a hot spot, as long as you don't allocate GC memory you will have the same performance as C or C++. Finally there is the safety aspect. I am not going to argue that D is equal to Rust in this aspect; it's not.
But D does give good tools to help ensure memory safety. D naturally has fewer footguns than C & C++, and the @safe D subset is nice. I won't go into more detail; here is a link: https://dlang.org/blog/2022/06/21/dip1000-memory-safety-in-a-modern-system-programming-language-pt-1 I will not pretend that D is perfect; it has its problems. It's not a popular language, and its community is small. Phobos (the std lib) is not perfect for @nogc code. Etc. I can elaborate if anyone wants. I try to approach programming-language discussions by talking about positives and not attacking other languages, but I will just say that I bring up D as a counter to C, Go, C++ & Rust. I have used C++ for a long time; I have used Java & Node. I have tried out Go and Rust. Go gets a lot right -- I love its fast compile times, and it's not a bad language, but it's made for a corporate/large environment with a large code base and an army of expendable employees. Keeping it simple is very useful there; it ensures the code base is maintainable over time even as people come and go. But for a small group of developers, or an individual, its restrictiveness is not helpful. One graybeard template wizard can be more productive than 10 Java monkeys. Then there is the Rust meme. The language has good ideas and has had a good impact on language design; it has gotten more programmers thinking about memory safety. But that's not the only important factor. What ultimately killed Rust for me was how slow working in it is -- and I am not talking about borrow-checker errors. I'm talking about the slow compile times and the rigidness that builds up as your project grows; it's often very hard to refactor a Rust (or C++) project. In my experience D, while not perfect, is actually really good at making projects enjoyable to work on. I don't think anyone here is getting paid, so developer enjoyment becomes important for motivation.
>>24667 Brilliant -- I noticed a typo within the first few words right after posting, despite reading the post over first -_- I hope people still read it and don't write me off; I'm not a complete retard.
Open file (91.82 KB 736x552 chii_ponders_2.jpg)
>>24667 >>24668 Hello EnvelopingTwilight, welcome. Mind if we just call you 'Twilight'? :^) Thanks for your inputs! Alexandrescu is a brilliant man, and has helped the C++ community a lot. D is a great idea of his (and could potentially have supplanted C++ very well), but unfortunately it can't really make much practical headway in the real-world systems programming domain (that is, a world completely dominated by the C & C++ programming languages) to effect real change. This is probably why he himself abandoned his own language a few years back. >G* & R*st Both have some good ideas, stole their concepts from proven systems already written in C & C++, and -- particularly in the case of R*st -- their communities are, roughly speaking, the equivalent of a pride parade down the middle of some conservative Technology Main Street town that doesn't care about or want any faggots around. >tl;dr It's their communities that are the primary problem; but the fact that both organizations behind them are squarely within the Globohomo Big-Technology Government complex is not at all helpful either... for anons trying to create unencumbered, opensauce robowaifus to serve men the world over. >tl;dr Both languages are far too toxic and problematic to be of much use to us here tbh. --- Glad you stopped in Anon. I think your pic is a cute one -- your robowaifu a cute! :^) BTW, we have an Embassy thread (>>2823) if you'd like to introduce yourself/your community to us here.
>>24667 Probably the only language that got inline asm right: nice and simple, without all the goddamn fluff and insanity like making every line a string, or passing variables by name but then only being allowed to use them as numbers indexed by the order in which you listed them -- oh, and don't forget to specify if it's a global/memory/pointer/register/etc., because clearly the compiler can't just figure it out from the declaration. D seems good if you want to mix high-level and low-level code -- like actually mix, not just call external functions.
>>24669
>Hello EnvelopingTwilight, welcome. Mind if we just call you 'Twilight'? :^)
I don't mind, feel free to do that.
>This is probably why he himself abandoned his own language a few years back.
Alexandrescu did not abandon D. He stepped back because he prioritized his family over a programming language (very based); here he is saying that in his own words, with more details (1). He is still with us; he actively participates in the D Foundation, attends the meetings, and is absolutely still around. If you listen to the rumor mill and read the doom-posting within the D community, you can get the impression that the language "is dead" or "dying", or that leadership sucks. But that is not the case; the people at the top are very skilled programmers. That being said, the D heads have the social skills of programmers, not "people" people. Some don't like that, but I love it: the top is not made up of "Public Relations" do-nothings. The D community is very self-critical, often to a detrimental level. You will see DIPs (proposals for D) "die", and then people will complain and say D is dying or that the language is stagnating, and yet with new releases I find quality progress that no one will celebrate. If you want a change in D, write a quality pull request and you will find that it's actually not hard to get stuff into D; what is very hard is to get others to do that for you. Don't be an "idea guy". If you use the D standard library, you will be running code I wrote; that is how I know this is true.
>It's their communities that are the primary problem; but the fact that both organizations behind them are squarely within the Globohomo Big-Technology Government complex
I did not want to get into bashing languages, but why not. I 100% agree: the Rust community is the worst group of people I have had the displeasure of ever interacting with. Rust has its roots in Mozilla, and that alone is a red flag.
The culture at Mozilla is some of the most extreme mental illness on the planet. Here is an article I ran into that does a good job showing how horrid Mozilla is (2), and why you should never give them a cent. If you use Firefox, please use Librewolf (3); that's my daily driver. Do not support Mozilla: they actively fund left-wing extremists that hate you. Another thing I will bring up is the Actix-web drama, when the community was such aids over the unsafe keyword that it pushed the developer to quit. If I were in that position, I too would say "I am done with open source".
>BTW, we have an Embassy thread (>>2823) if you'd like to introduce yourself/your community to us here.
Sure, I'll make a post & say hi.
tl;dr having a smaller community not built out of hype and not backed by any large globohomo corporations is actually kinda nice :^)
1: https://www.youtube.com/watch?v=cpTAtiboIDs&t=3049s
2: https://lunduke.locals.com/post/4387539/firefox-money-investigating-the-bizarre-finances-of-mozilla
3: https://librewolf.net/
>>24680
>he actively participates in the D foundation and attends the meetings and he is absolutely still around.
Excellent! This is very good news to hear, Anon. D is a very nice language and would be a good candidate for a robowaifu systems language, as I've pointed out here before. But the fact that it was (by all appearances) abandoned by its #1 developer pretty much disqualified it from current consideration. Also, it's not yet an international standard (and is therefore quite susceptible to Globohomo-incited corruption). I know Andrei works closely with the members of the C++ Standards Committee; do you think it likely that D will be made into an ISO international standard anytime soon? That would be a tremendous boost to legitimately investigating using it as the primary robowaifu systems programming language.
>If you use the D standard library you will be running code I wrote, that is how I know this is true.
Neat! I've often toyed with the idea of joining the C++ Standards Committee myself, and I'm exactly the type that Bjarne Stroustrup is clamoring for to do so. That is, I'm an end user, and not a corporate-interest shill.
>Do not support Mozilla, they actively fund left wing extremists that hate you.
Indeed. All the Filthy Commie cohorts, and most of the general Leftist ones, hate what /robowaifu/ and its cadres stand for with a passion. This is as they have been programmed by the Globohomo indoctrinations to be, of course -- all by design. Marxism is a cult religion for these golems; we here are in flat opposition to all that, by empowering individual men against their machine. :^)
>tl;dr having a smaller community not built out of hype and not backed by any large globohomo corporations is actually kinda nice :^)
Yes, we are all of those things, Twilight. Looking forward to your Embassy post. Cheers. :^)
>===
-minor edit
Edited last time by Chobitsu on 08/21/2023 (Mon) 00:56:03.
>>24687
>do you think it likely that D will be made into an ISO international standard anytime soon?
No, there is no interest in that at the moment; I do not see it happening soon.
>>24762 OK, fair enough. It's a big commitment, and can also be incredibly-convoluted as a formal process. On the plus side however, being an international ISO standard makes any language much more robust in the face of opposition to """controversial""" uses -- such as creating robowaifus! :^) >=== -prose edit
Edited last time by Chobitsu on 08/24/2023 (Thu) 14:02:31.
>>25020 First talk is a skip
>>25020 Thanks Anon, I'll check this year's event out at some point soon.
DConf has ended. I have commented the timestamps for each talk (except day 1; someone else did that). For people who use the SponsorBlock addon, I have submitted the intermissions. https://www.youtube.com/watch?v=uzuKqiFVNZM https://www.youtube.com/watch?v=wXTlafzlJVY https://www.youtube.com/watch?v=CMrb6ZWhqXs For more information on the talks, the schedule is here: https://dconf.org/2023/
>>25102 Excellent. Thanks for all the hard work, Twilight. So, one of the language's important principals is closely associated with Ucora. Can you give me any further insights on them beyond link-related, Anon? https://www.ucora.com/about/
>>25103 Here is the forum post that announced their involvement with the D foundation; that should hopefully give you an idea of what the relationship is and what they are doing for the foundation. https://forum.dlang.org/post/avvmlvjmvdniwwxemcqu@forum.dlang.org Hope this answers your question.
>>25126 Thanks for the link Anon, I'll peruse it.
So I learned something different that could be very significant for robust, error-free waifus. I'm looking at the Ada language for kicks. I started following links and found something I didn't know. AdaCore, which is connected with Adafruit, a company that makes lots of different single-board computers and electronics parts, is really into Ada. Especially the SPARK programming language. SPARK is a subset of Ada, trimmed down to make it even safer. The idea being that you have to declare everything and the compiler catches most every mistake, to squeeze out bugs. Hackaday has an article on this. Some say it's BS, but a lot say that SPARK and Ada really crush the likelihood of bugs. There's no doubt that lots of things that just have to work (space shuttles, space stations, the F-22, the F-15, and lots of medical equipment) use Ada or SPARK and have strong protections against surprise bugs. https://hackaday.com/2019/09/10/why-ada-is-the-language-you-want-to-be-programming-your-systems-with/ I found a video where Nvidia says they are going to move to all-SPARK programming because so many of their chips are used in task-critical areas like self-driving. https://www.youtube.com/watch?v=2YoPoNx3L5E Here's a link on learning Ada and SPARK: https://learn.adacore.com/ The link below gives a huge number of mission-critical items that used SPARK: https://en.wikipedia.org/wiki/SPARK_(programming_language) While SPARK and Ada may not be glamorous or crafty like LISP, they often work exactly as programmed, without surprises, the first time. Worth thinking about. I wonder: are there tools that will accomplish the same thing with C++ or other languages, without using Ada or SPARK? https://www.hackster.io/adacore
>>25214 You know I think a lot of you Grommet, so please don't take this personally, but > Adafruit They are a great iconization of Leftists in general, and will be no friends to /robowaifu/ and our ilk once the 'rubber meets the road', so to speak. The name itself should be a big tipoff. Simply research the founding/history of the organization, Anon. Use them; that's fine as long as you take everything related to them with a big grain of salt. I'd personally much prefer Newark or Black Box for supply of electronics, etc. Not only are they much more well-established in the engineering domains, they are much less likely to be completely overrun by Filthy Commies. :^) > Ada I've mentioned the benefits of Ada numerous times here on the board; also why I feel it is primarily-disqualified for us. The hardware onboard aircraft doesn't 'magically' get safety just because Ada is being used there. It still requires strict engineering discipline and rigorous testing (at huge costs: Ada is a very expensive language to use in general). And no system is 100% safe. > I wonder are there tools that will accomplish the same thing with C++ Both D and C++ can be written in highly safe ways. However D has other challenges for us here, so I can't recommend it (at least yet). C++ has a very strong impetus for a pared-down, sensible/sane usage of the language known as the C++ Core Guidelines [1], which I intend to teach some of the most important basics of in our secondary classes here (including automated tools for checking CPPCG compliance). 1. https://isocpp.github.io/CppCoreGuidelines/CppCoreGuidelines >=== -prose edit
Edited last time by Chobitsu on 09/05/2023 (Tue) 13:45:46.
>>25215 >don't take this personally No. I had heard of them (...a great iconization of Leftists...) but actually came at them through Ada. I found that the Adafruit people somehow have some sort of affiliation, however tenuous, with Ada, which I didn't know about. They link to each other. The people that did Ada, I think, made it too much by committee. The Ada fanatics say the newer versions have fixed a lot of this. They have further fixed it, I believe, by paring it down to a subset called SPARK. Probably the most retarded name for software ever. How many things are called Spark? Way too many. When you say Ada is "expensive" I think of it totally differently. It's well thought out, takes a whole approach to things, and is not the hacked-up abomination that C is. (I've been reading the "Unix Haters Handbook" again.) If these people had real smarts, they would make a good higher-level SPARK frontend, like they have, then get some sort of miracle C/C++ hacker to write an extensive C/C++ program that would exercise all the basics of the registers and all the other functions of a processor or microprocessor (in C/C++), and then just read what it compiled to. Then they could equate the compiled machine-code results to SPARK commands and back-compile a compiler for every processor they could find, while leaving the high-level SPARK tools and programming syntax in place. There aren't enough SPARK compilers for all the processors, and making them is not easy. Why they don't use computers to do what they're supposed to do, the drudgery work, and automate it, I have no idea. Likely because the Ada people hate C and the C people hate Ada, so nothing good ever happens. Worse is better. I fully, I believe, understand why you like C/C++ so much. It gives you complete control, but unless you have years of experience it can cause all sorts of grief. Stuff like SPARK and Ada is essentially boring, as it does things the right way.
Ever heard of (I know you have) https://en.wikipedia.org/wiki/Worse_is_better ? That's what C/C++ is: worse. Let's note that while C/C++ might make it easy to whip something up, all the time you spent hitting all the gotchas and memorizing all the gotchas, you could have spent a little more time with something like Ada and got it right in the first place. C/C++ only seem easier after spending years hacking away at them. Stuff like Ada may take longer to set up, but only because you have to get it right or it just bleeps at you and tells you you fucked up. C will take whatever you throw at it and promptly throw up bits on your keyboard.
>>25219 > conflating the ultra-pozz den Adafruit with the 40+yo US DARPA/DoD overarching, sprawling, mandated programming language (rapidly 'replaced' by the MIC industry using C & C++ instead, b/c of Ada's intractability fail -- especially during the first couple decades of the enforced debacle). < shiggy > I fully, I believe, understand why you like C/C++ so much. I don't 'like' C++ except insofar as it is literally the #1 most-likely means for /robowaifu/'s successful & complete systems software solutions; to give all of us entirely-unencumbered, practical, realworld robowaifus. > That's what C/C++ is, worse. With all due respect Anon, this is the literal last time I'm having this debate with you. Kiwi's right, you keep rehashing topics repeatedly (cf., search your own use of the term 'worse' ITT). By all means Anon, knock yourself out. You've expressed an interest in learning how to program software. Go ahead and tackle Ada as your language of choice. GCC offers a great front-end for the language via their amazing compiler system. [1] After enough experience, you too will understand why so few engineers willingly adopt the language on a personal level in actual production systems, and why it is literally one of the most expensive approaches to software development known today. > tl;dr Let's give any further programming-language discussions between us a miss, friend Grommet. If you'd like to participate in our own /robowaifu/ programming classes here, and as long as you stay on-topic (ie, C++ programming, etc.), then of course that's an exception to this mandate regarding yourself. Cheers Anon, and kind regards. :^) 1. https://gcc.gnu.org/wiki/GNAT >=== -prose edit
Edited last time by Chobitsu on 09/07/2023 (Thu) 18:03:07.
>>25243 >With all due respect Anon, this is the literal last time I'm having this debate with you. Kiwi's right, you keep rehashing topics repeatedly Fair enough and a fair criticism. I'll try to take it to heart. I do make mistakes and also try to admit it when I do. My apologies.
I was looking into Elixir (lang) for creating my implementation of the cognitive architecture. That said, I keep an open mind and found this here for C++: https://github.com/endurox-dev/endurox https://en.wikipedia.org/wiki/Enduro/X >Enduro/X is an open-source middleware platform for distributed transaction processing. It is built on proven APIs such as X/Open group's XATMI and XA. The platform is designed for building real-time microservices based applications with a clusterization option. Enduro/X functions as an extended drop-in replacement for Oracle Tuxedo. The platform uses in-memory POSIX kernel queues, which ensures high interprocess communication throughput. > ... AGPL It also has Python bindings. Hmm. Anyone worked with this?
>>25219 >>25214 I'm a little late (don't have a lot of time to be checking robowaifu) but I think I can say something of value for Grommet. While good languages can have fewer footguns and be designed to make mistakes harder, a language should not be chosen "for being safer" or for any other memed purpose. Are you picking Ada because it solves problems for you, or are you picking it because it's "safe and runs on my heckin airplanes TM"? >but unless you have years of experience it can cause all sorts of grief. This is true; that's why I advise starting now, clocking in hours, and gaining experience and confidence. Even if the language stops memory bugs, there are so many other mistakes you can and will make. You can write good code in almost any language. Just pick a language and start writing, try to learn it as well as you can, and don't get stuck in analysis paralysis. I guess there is some irony in me saying this in a thread called "Selecting a Programming Language" but I think it's the most helpful thing to say. If you want safety, I recommend you take a proven approach: test everything!!! You need 100% coverage, and the tests need to be good and not overfitted. SQLite is a perfect example of this. It's very reliable, runs on le hecking airplanes, and is bug-free while being written in pure C. It's not a stagnant codebase of untouchable C; because the tests are good, they can afford to do large rewrites between even minor releases.
