General Robotics/A.I. news and commentary Robowaifu Technician 09/18/2019 (Wed) 11:20:15 No.404 [Reply] [Last]
Anything in general related to the Robotics or A.I. industries, or any social or economic issues surrounding them (especially those involving RoboWaifus).
www.therobotreport.com/news/lets-hope-trump-does-what-he-says-regarding-robots-and-robotics
https://archive.is/u5Msf
blogmaverick.com/2016/12/18/dear-mr-president-my-suggestion-for-infrastructure-spending/
https://archive.is/l82dZ
>===
-add A.I. to thread topic
Edited last time by Chobitsu on 12/17/2020 (Thu) 20:16:50.
345 posts and 146 images omitted.
>>16481
LOL. <inb4 le epin memery
Just a quick note to let Anons know this thread is almost at the autosage limit. I'd like suggestions for the OP of #2 please. Thread subject (if you think it should be changed), OP text, links, pics?
>>16482 Would be cool to combine the usual with scaling hypothesis link https://www.gwern.net/Scaling-hypothesis and lore (maybe a single image with a mashup of DL memes) https://mobile.twitter.com/npcollapse/status/1286596281507487745 Also, “blessings of scale” could make it into the name
Open file (35.94 KB 640x480 sentiment.png)
>>16482
It might be good to have a thread dedicated to new papers and technology for more technical discussion that doesn't fit in any particular thread, and another for more general news about robotics and AI.
>>2480
I did some quick sentiment analysis back in April and there were a lot more people positive about MaSiRo than negative. About a third were clearly positive and looking forward to having robowaifus, but had reservations that the technology has a long way to improve before they would get one. Some said they only needed minor improvements, and some were enthusiastic and wanted to buy one right away even with the flaws. Most of the negative sentiment was fear, followed by people wanting to destroy the robomaids. Some negative comments weren't directed toward robowaifus but rather at women and MaSiRo's creator. And a few comments were extremely vocal against robots taking jobs and replacing women. Given how vicious some of the top negative comments were, it's quite encouraging to see that the enthusiasm in the top positive comments was even stronger.
>>2484
Someone just needs to make a video of a robomaid chopping vegetables for dinner with a big knife and normies will repost it for years to come, shitting their pants. Look at the boomers on /g/ and /sci/ that still think machine learning is stuck in 2016. If any meaningful opposition were to arise against robowaifus it would have to come from a subculture, given the amount of dedication it takes to build them. Most working on them have already been burnt by or ostracized from society and don't care what anyone thinks. They hold no power over us. So don't let your dreams be memes, unless your dreams are more layers, then get stacking. :^)
Open file (31.00 KB 474x623 FPtD8sBVIAMKpH9.jpeg)
Open file (185.41 KB 1024x1024 FQBS5pvWYAkSlOw.jpeg)
Open file (41.58 KB 300x100 1588925531715.png)
>>16482
This one is pretty good. We're hitting levels of AI progress that shouldn't even be possible. Now we just need to get Rem printed out and take our memes even further beyond. I'd prefer something pleasing to look at rather than a meme though, since we'll probably be looking at it for 2+ years until the next thread. The libraries in the sticky are quite comfy and never get old.
>>16483
>Also, “blessings of scale” could make it into the name
Not to be contentious, but is scale really a 'blessing'? I mean for us Anons. Now obviously large-scale computing hardware will play into the hands of the Globohomo Big-Tech/Gov, but it hardly does so to the benefit of Joe Anon (who is /robowaifu/'s primary target 'audience' after all). I feel that Anon's goals here (>>16496) would instead serve us (indeed, the entire planet) much, much better than some kind of always-on (even if only partially so) lock-in to massive data centers for our robowaifus. No thanks! :^)
>>16487
>It might be good to have a thread dedicated to new papers and technology for more technical discussion that doesn't fit in any particular thread and another for more general news about robotics and AI.
Probably a good idea, but tbh we already have at least one 'AI Papers' thread (maybe two). Besides, I hardly feel qualified myself to start such a thread with a decent, basic OP. I think I'll leave that to RobowaifuDev or others here if they want to make a different one. Ofc, I can always go in and edit the subject+message of any existing thread, so we can re-purpose any standing thread if the team wants to.
>Given how vicious some of the top negative comments were it's quite encouraging to see the enthusiasm in the top positive comments was even stronger.
At least it looks to be roughly on-par, even before there is a robowaifu industry in existence. Ponder the ramifications of that for a second; even before an industry exists. Robowaifus are in fact a thousands-years-old idea whose time has finally come. What an opportunity, what a time to be alive! :^) You can expect this debate to heat up fiercely once we and others begin making great strides in a practical way, Anon.
>Someone just needs to make a video of a robomaid chopping vegetables for dinner with a big knife and normies will repost it for years to come shitting their pants.
This. As I suggested to Kywy, once we accomplish this, even the rabid, brainwashed feminists will be going nuts wanting one of their own (>>15543).


Edited last time by Chobitsu on 05/28/2022 (Sat) 16:07:14.

Robowaifu-OS & Robowaifu-Brain(cluster) Robowaifu Technician 09/13/2019 (Fri) 11:29:59 No.201 [Reply] [Last]
I realize it's a bit grandiose (though probably no more than the whole idea of creating an irl robowaifu in the first place), but I want to begin thinking about how to create a working robowaifu 'brain', and how to create a special operating system to run on her so she will have the best chance of remaining an open, safe & secure platform.

OS Language Choice
C is by far the single largest source of security holes in software history, so it's out more or less by default. I'm sure that will cause many C developers to sneer at the very thought of a non-C-based operating system, but the unavoidable cost of fixing the large numbers of bugs and security holes that are inevitable in a large C project is simply more than can be borne by a small team. There is much else to do here besides writing code, and C hooks can be generated wherever deemed necessary as well.

C++ is the best candidate for me personally, since it's the language I know best (I also know C). It's basically as low level as C but with far better abstractions and much better type-checking. And just like C, you can inline Assembler code wherever needed in C++. Although poorly-written C++ can be as bad as C in terms of safety, due to the necessity of its being compatible with C, it also gives the sane coder who adheres to simple, tried-and-true guidelines many facilities to avoid going there. There is also a good C++ project already ongoing that could be used for a clustered unikernel OS approach, for speed and safety. This approach could save drastic amounts of time for many reasons, not the least of which is tightly constrained debugging. Every 'process' is literally its own single-threaded kernel, and mountains of old-style cruft (and thinking) typical of OS development simply vanish.
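To make that contrast concrete, here is a minimal, hypothetical sketch (not taken from any actual RW code) of the kind of 'simple, tried-and-true guidelines' meant above: the same copy task written against a raw C buffer versus a bounds-checked C++ container.

// Illustration only: the raw-pointer version compiles silently even when it overflows,
// while the std::array version keeps the size in the type and truncates safely.
#include <array>
#include <cstring>
#include <iostream>
#include <string>

// C-style: nothing stops an over-write; the bug is invisible at compile time.
void c_style(const char* src) {
    char buf[8];
    std::strcpy(buf, src);  // overflows whenever src holds more than 7 characters
    std::cout << buf << '\n';
}

// C++-style: the container knows its own size, so the loop cannot run past it.
void cpp_style(const std::string& src) {
    std::array<char, 8> buf{};
    for (std::size_t i = 0; i < src.size() && i + 1 < buf.size(); ++i)
        buf.at(i) = src.at(i);  // .at() throws instead of corrupting memory
    std::cout << buf.data() << '\n';
}

int main() {
    cpp_style("robowaifu");  // prints a safely truncated "robowai"
}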

FORTRAN is a very well-established language for the sciences, but a) there aren't a lot of FORTRAN coders, and b) it's probably not the greatest at being a general-purpose language anyway. I'm sure it could be made to run robotics hardware, but would probably be a challenge to turn into an OS.

There are plenty of du-jour SJW & coffee languages out there, but quite apart from the rampant faggotry & SJWtardism plainly evident in most of their communities, none of them have the kind of industrial experience and pure backbone that C, C++, or FORTRAN have.

D and Ada are special cases and possibly bear due consideration in some years' time, but for now C++ is the obvious choice to me for a Robowaifu OS foundation, with Python probably being the best scripting language for it.
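On that last point, here is a hedged sketch of one way a C++ core could host Python as its scripting layer, using CPython's standard embedding API (Python.h from the python3 dev headers). It illustrates the general idea only; it reflects no actual Robowaifu-OS design.

// Minimal embedded-Python example: the C++ side owns the interpreter lifetime
// and hands it a trivial script. Build by linking against libpython3.
#include <Python.h>

int main() {
    Py_Initialize();                         // start the embedded interpreter
    PyRun_SimpleString(
        "greeting = 'hello from the scripting layer'\n"
        "print(greeting)\n");                // run a waifu-side script given as a plain string
    return Py_FinalizeEx() < 0 ? 1 : 0;      // shut down cleanly, reporting failure if any
}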

(1 of 2)
55 posts and 19 images omitted.
>>13174 lel'd. >How do I Well, you start by not letting her get behind the wheel at night anon.
>>201
https://www.mythic-ai.com/technology/
https://youtu.be/GVsUOuSjvcg
Relevant and of interest for AI computing technology.
>just dropping this here for refs:
Operating Systems: Three Easy Pieces
Remzi H. Arpaci-Dusseau and Andrea C. Arpaci-Dusseau
Arpaci-Dusseau Books, August 2018 (Version 1.00)
>"The book is centered around three conceptual pieces that are fundamental to operating systems: virtualization, concurrency, and persistence. In understanding the conceptual, you will also learn the practical, including how an operating system does things like schedule the CPU, manage memory, and store files persistently. Lots of fun stuff! Or maybe not so fun?"
https://pages.cs.wisc.edu/~remzi/OSTEP/
>>201 tbh as a side project I would be interested if you succeeded in buying a few cheap Intel Xeon Phis from ebay and integrating them into a system with non-trivial performance. But my main platform is gaming GPUs for now.
>>16469 Not a bad idea Anon. I think the main point to start with is maybe a basic 4-SBC cluster simply to provide cheap+energy-efficient failover safety for our starter robowaifus. Thereafter, the sky's the limit so yea.

Open file (410.75 KB 1122x745 birbs_over_water.png)
Open file (99.96 KB 768x512 k0p9tx.jpg)
/robowaifu/meta-5: It's Good To Be Alive Robowaifu Technician 03/07/2022 (Mon) 00:23:10 No.15434 [Reply] [Last]
General /robowaifu/ team survey: (>>15486) Please respond Anon
/meta & QTDDTOT
Note: Latest version of /robowaifu/ JSON archives available is v220523 May 2022 https://files.catbox.moe/gt5q12.7z
If you use Waifusearch, just extract this into your 'all_jsons' directory for the program, then quit (q) and restart.
Mini-FAQ


Edited last time by Chobitsu on 05/24/2022 (Tue) 05:08:32.
210 posts and 75 images omitted.
Open file (175.19 KB 235x480 Lapis-figurine.png)
I've just bought a 3d printer (a Creality Ender 3 Pro as my first printer) https://www.creality.com/products/ender-3-pro-3d-printer I will most likely pick it up on Friday. I already have filaments, which I got for free at a convention. If it works out well I'll be expanding on it.
>>16461
Congrats Hik! That's a nice one IMO. Looking forward to your printout pics!
>figurine
A Cute! I'm really looking forward to an Aoki Lapis robowaifu Anon. :^)

waifusearch> Ender
THREAD SUBJECT                       POST LINK
3D printer resources             ->  https://alogs.space/robowaifu/res/94.html#8850      ender
        "                        ->  https://alogs.space/robowaifu/res/94.html#15714       "
Prototypes and failures          ->  https://alogs.space/robowaifu/res/418.html#15926      "
        "                        ->  https://alogs.space/robowaifu/res/418.html#15930      "
Short Stacks the Obvious Solutio ->  https://alogs.space/robowaifu/res/2666.html#2666      "
/robowaifu/meta-4: Rugged Mounta ->  https://alogs.space/robowaifu/res/12974.html#15333    "
The Sumomo Project               ->  https://alogs.space/robowaifu/res/14409.html#15256    "
kiwi's Tutorials                 ->  https://alogs.space/robowaifu/res/14704.html#14711    "


Open file (1.96 MB 1288x966 3dprinter-assembled.png)
Open file (1.58 MB 1159x758 3dprinter-parts.png)
I got and assembled my 3d printer today. Took about 3 hours to put it together. I have yet to configure it and set up software for it and such; will get to that soon, then I'll make test prints and tweak slic3r settings and so on.
>>16494 Congrats! And welcome to the 3D-printer calibration technicians' club! You have to choose a slicer (cura, slic3r) and a CAD program (solvespace, freecad, cadquery, openscad or something else) to continue.
>>16494 Gratz Hik! May your 3D-printing adventures go smoothly. Cheers.

Open file (37.41 KB 1213x234 oof.png)
Julay / Alogs.space / Guntretort Joshua Clayton Connor 05/05/2022 (Thu) 12:11:33 ID: 4bb6d7 No.147784 [Reply]
Julay was the prime-time muster bunker after the downfall of 8chan, with by far the most boards and activity; a good deal of 8ch refugees managed to find their way to it and haphazardly re-establish their communities, and it even sported a beloved vidya board. But it didn't luck out with administration: the absolute retards running it decided, among other things, to delete 90% of the boards so that people "went and made their own instead of relying on one single site". As a result all those people, those communities that had already lost about 60% of their userbases to 8ch going down, left the webring entirely, unable to survive getting kicked out yet again. Some went to cuckchan, some scattered to the wind forever. Now julay is a completely dead site existing pretty much solely for /cow/, its one active board. Even the few boards that were allowed to stay have now left for other sites on the webring. Julay is definitely a lolcow of the webring.
24 posts and 10 images omitted.
Open file (81.24 KB 301x304 Niggerchu.png)
Julay is a gay board, full of fat, autistic pedophiles, and N​IGGERchu would not fuck anybody here So has N​IGGERchu spoken
>>150354 >pedochu implying he is gay
webring is a horrible idea and you are all fags
>>150691 Chadring is the best thing that ever happened to the internet. Your faggy site will never be accepted. Cope.
>>150231 Yes, he's a zoomer (or considers himself one even though he's older, like Warhammer does) and used to get really butthurt when vol3 deleted his shitty threads or his lolis.

Open file (207.63 KB 550x1361 SmegmaButterBattle (1).png)
Open file (421.25 KB 489x559 buttergangsta3ab.png)
Open file (313.47 KB 1170x811 image0 (16).jpg)
Fuzhou the Buttergolem/Buttermunculus/Hapabutter/Butterboy Toad 08/30/2021 (Mon) 19:24:05 No.117126 [Reply] [Last]
Have (you) ever wondered what it would be like when it got to the point that Dominic Vanner, of all cows, would have his own a-log, a freakish abomination of soy and butter that was worse than him in almost every respect? The pathetic golem of butter, Fuzhou, has finally fulfilled that role and exceeded expectations in every way possible. Fuz is a julaytoasted grifter trying to pick up the crumbs left by Jewsh Moon; his viewers are fat single ladies into internet drama, those being 95% of his views. Like a lot of drama grifters, Fuz is a lolcow in his own right thanks to his personal life and the humiliation he has gone through online. Being a spineless golem, Fuz makes two-hour streams reading foxdickfarms threads or wikipedia to get his "cocks", hoping to make some money to buy sex toys for his Onlyfans "girlfriend", more weed and funko pops for himself, and finally have enough money to leave his parents' house in his mid-20s. But Fuz's channel is far from being interesting, or the focus of his alogs. Even though he's a black hole of humor, Fuz has become a fun figure to watch and debate bickers of his own dramas, failures and actions, catching the attention of certain groups and even ecelebs. At first glance, Fuzhou appears to be just an unfunny grifter with zero talent, one of the many julaytoasted wannabe Mister Metokur clones that have plagued this side of the internet in recent years. But a deeper look, penetrating the layer of fat and boredom that surrounds him, reveals a soulless automaton, yet one able to feel at some level a certain hate for himself and for the world. In a philosophical way, he is able to become interesting. Do Golems dream of butter sheep? Is Fuz a sample of what the kikes want for the future? Maybe he's the next step of mankind, the future nu-male, Homo Consumat Butyrum. Vaxxed, cucked, only caring about consooming and extremely submissive. A ruined and terrible form of life, a Butter Golem.
Clips of Fuzhou for (you)r consideration:
Fuzhou getting btfo'd by Dominic Vanner:
https://yewtu.be/watch?v=e7LSOZ7Rq0w
https://yewtu.be/watch?v=9p3HyPXixrE
On1yfans Buttercaust:
https://yewtu.be/watch?v=W_wJ-0HxBVc


Edited last time by oldestfag on 01/22/2022 (Sat) 23:08:48.
372 posts and 114 images omitted.
What happened with the Butter, other than him backstabbing the Гунт and trying to delete his past? Is he still reporting on the emptyhero discord server or being part of the streams?
>>148804 Fuzzy talked about how he was okay about diddling as long as the diddler was a woman.
Open file (43.66 KB 629x356 basedsargoon.jpg)
>>149604 >it depends on the gender
>>148804 I honestly don't even watch Empty anymore, I thought he stopped streaming to dedicate himself to his job and shitty books. It's amazing how he's had a massive drop in quality and performance since Fuz came into his life. I hope Fuzhou does good titsjobs with his butter mantits... otherwise this friendship is inexplicable.

Open file (41.41 KB 1280x720 Michael.jpg)
Michael Does Life, the incel avatar of /v/ Pollythefrog69 04/18/2022 (Mon) 14:43:46 ID: 56a880 No.144416 [Reply]
Michael, also known as Mikael or Mickal by his fans, is an almost-40-year-old incel vidya critic best known to normies for being the man behind the early criticisms of Cyberpunk 2077 when everyone was still hyping the game. Michael constantly cries about not being credited for this and has a personal crusade against CD Projekt RED. Michael also has a vendetta with ReviewTechUSA: back when Michael was a fan of the show he got invited onto a live stream, only to be disavowed by ReviewTechUSA on account of Michael's fans being 4chan neo-Nazis. Michael is known for being one of the few youtubers left who let anyone call into his show, including people yelling "NlGGERS, KILL ALL NlGGERS" while he's trying to talk about vidya, only to have live mental breakdowns about not being taken seriously and having useless fans from /v/. https://www.youtube.com/watch?v=O23_h1bF7V4 Michael used to have a bf named Dillon in the Details; they both streamed together reviewing games until they had a fight about whether or not they liked a certain game, separating them both.
https://youtube.fandom.com/wiki/Michael_Does_Life
Channel> https://www.youtube.com/c/MichaelDoesLife/videos
Channels which mirror some of his deleted videos and streams> https://www.youtube.com/channel/UCFFbeisumgom2Mz1LRfTOSw/videos
https://www.youtube.com/playlist?list=PLDl1Lb8EU58rzVa0ihItM6SJTHcx118e1
His twitter> https://twitter.com/michaeldoeslife
1 post omitted.
From time to time I watch Michael; if before he just played a role, mixing it with irony and self-hatred, now he's getting a bit crazy. He's doing these weird streams at night, singing or just staring at the cam without saying anything for hours, and changing his personal opinions.
>>144786
>torpedo from zzzchan being butthurt and pretentious
Nothing new under the sun
Mickael is getting old with his gimmick always being the same, and he's feeling it and getting more and more depressed and angry with random people. Also, even he feels like it's fucked up that his main fan atm is a boy in his teenage years calling into the show and sucking his dick.
Michael will have the first date of his incel life and it will be online for everyone to watch https://www.youtube.com/watch?v=qH7JYJgvYK8
I got banned from posting on his channel. Mickael is having a mental breakdown bickers of Starfield and his e-date.
>>148801 His "dated" failed, since then Michael is having an emotional crisis. 1-The exciting new cocks he promised comes down to him eating mcnuggets or other garbage fastfood. 2-Got a new haircut and decided to backstab xbox fanboys while having rants about incels 3-Deleted more livestreams 4-Now he's saying he's going to become an SJW and only stream on twich bickers he's tired of edgy fans.

Poseidon, former right-winger grifter and Pagan LARPer / Neet shitskin and schizo Brian Holloman 03/19/2022 (Sat) 23:19:42 ID: 90501c No.141211 [Reply]
Poseidon is one of those many grifters who got onto the "redpill" bus around 2016 by making videos about immigration in Europe. As a "greek-sicilian" he was said to have a more European opinion on all subjects than the average right-wingers from the UK, Germany and the USA. However, he never became popular in circles like 4chan /pol/, with only 8chan /pol/ and a few European anons paying any attention to him. In 2018 he had his former channel taken down, and created a second channel (his current one). But from 2019 he started to show himself to be a little less mature and socially successful than he claimed. During weekends he started doing livestreams called "Whitepilled weekend", which consisted of him quietly reading news in his heavy accent while royalty-free music played in the background. Note that, as a European, he started these streams every weekend around 2am European time. Over time these livestreams began to gain a share of trolls, mocking the fact that Poseidon, supposedly a mature family man, spent every weekend awake hoping for donations and seething about any jokes about him in chat. After 8chan /pol/ noticed that Poseidon didn't take criticism well and was openly banning anyone with the slightest criticism of him, the views started to drop and more trolls started to show up. Over time he stopped those livestreams; according to him, the Catholic Church was responsible for his troll problems, sending the trolls together with american Daily Stormer trolls. With that, his first signs of schizophrenia and narcissism showed. In 2020 Poseidon embarked on the covid wave and started bleeding views and subscribers. His videos about immigration and politics in Europe stopped, and he started making videos saying that Donald Trump is a massive loser retard, and the same about anyone who supported him, to the point of making a sequence of videos proving that Biden won the election legitimately. Interspersed with videos about Biden being a "chad", he made multiple videos about covid restrictions and the vaccine. In these videos he was totally in favor of forced vaccination, use of masks and severe restrictions. With that, his viewers, whom he now denounces as "far-right" and "crazy Christians", started jumping ship and making serious criticisms of him. How did Poseidon react? He started replying, comment by comment, personally attacking anyone criticizing him. He would look at the Youtube channel of anyone criticizing him and dig up any detail about the person's personal life or their taste in games, music, etc. He also obsessively started attacking any Americans or Christians in the comments of his videos, even those who hadn't shown any criticism of him. Some of the videos around this time had their comment sections locked. At the same time, he made a video saying that the right-winger movement was being controlled by the Catholic Church, the KGB and the American elite; that it would be impossible for Europe to prosper as long as Christians lived in Europe or people voted for politicians with any connection to Russia or the USA, once again attacking any European politicians who have met at any time in their lives with Putin or Trump, or who are against covid restrictions.
Even his old opinions about immigration changed: now he denounces "White Genocide" as a conspiracy theory made up by crazy far-right incels, and says the Elite is right to want more low-cost employees in Europe and North America, and that Christians are just worried about being replaced with Muslims and don't care about anything else. With that, he has lost around 3k subscribers this year so far, with downvotes predominant. So he started making videos claiming he was being attacked by NeoNazis and Christians whose goal was to dox him and his family, although there is no evidence that anyone really cared enough about him to do so, and claiming the far-right are the true terrorists when compared to Antifa and that he was being unfairly attacked by extremists and genocidal idiots who wish for the end of the world instead of getting a vaccine. From above 25k subscribers, his channel now has less than 6k subs.


29 posts and 24 images omitted.
>>143702 <clicks link Jesus Christ, the absolute cringe.
None of his fans are even real; they are sock accounts.
Open file (67.71 KB 928x526 ClipboardImage.png)
He's making those shitty short videos with "evidences" of why X person is a russian asset. He made two or three about Trump being a KGB minion, but he deleted them bickers he's a coward grifter after all. Instead, he's once again trying to grift toward Spic Fuentes or other figures. https://www.youtube.com/watch?v=tdbYFl3gxdw&ab_channel=POSEIDON THE TRUE REDPILLED AND ILLUMINATED GRIFTER IN POLICIES! Since all the Ukraine shit is barely getting views now from anyone trying to make money with it, let's see what's next. Maybe it will be another cry about anti-vaxxers.
wtf is this guy's problem? he keeps deleting videos and seems to take his youtube life very seriously
>He's now trying to copy Memeology 101 style of videos

DJ Axle / Phantom Organization / Jester69 / Aksel Birger Enstad JDANKS420 09/29/2021 (Wed) 06:12:11 ID: 46f067 No.121078 [Reply]
PO is a furry faggot incel NEET tug-boat-having kid-touching attention whore with a race-mixing sister. He came to prominence by stream-sniping the Гунт but is too lazy to even do that for very long. He just begs for shekels on his shitty furry cringe stream that no one watches and posts furry porn in his server where his last few friends are. Now he is chimping out on discord and no one can figure out exactly why. Seems to be some sort of autistic meltdown.
Racemixing julay truck sister: Hanne Engstad
https://archive.md/8HJfU
https://mobile.twitter.com/HEngstad
https://archive.is/2nbBZ
DOX
Aksel Birger Engstad
99 35 29 03
Engasvegen 320, 7900 R0RVIK
32 posts and 32 images omitted.
I honestly want to fuck Aksel in the ass
Open file (1019.37 KB 720x400 JonTron_But_why.webm)
>>140395 i bet he'd cum handsfree like the little slut he is
Based Axle calling out the grifters and PLEASE STOP TROLLING BRITBONG, HIS PARENTS ARE BEATING HIM AND HE WON'T STOP SCREAMING AT US IN DISCORD'ing Fagmencope https://youtu.be/FZ4xoptlm0M Alliance of Good = Surfer, Gahoole, P.O, MrDeadman and TheDaiymo

Open file (8.77 MB 1280x720 pedo_reveal.mp4)
Open file (367.36 KB 819x895 chef_john.png)
Open file (54.12 KB 186x246 pedobrain.png)
Open file (218.19 KB 561x649 pedojust.png)
John Barnhill/koirey Barnhill/Koisheep/kimeemaru Anonymous 08/01/2020 (Sat) 06:39:16 No.70142 [Reply] [Last]
Names: John (real first name), John Barnhill, Koirey Barnhill, Van, former twitch name Koisheep, Kimeemaru, prinny, dr oktoberfest (last 2 are old steam names)
John Barnhill, otherwise known as the long lost brother of Corey Ray Barnhill, is a 22 (or 21) year old fat jewish leaf pedophile NEET who lives somewhere in Ontario, leafland — most likely Hamilton. He is a newfag who got into image boards in 2015 during the tail-end of #gaymergoyim, which is what, in his own words, truly inspired him to get involved in the community/get political. He was an underagefag or a "teenbro" (in his own words) for most of his time on image boards. Besides being the pedophile that tried to spam Julay to death with cp, he is most notably a failed z-celeb streamer and speedrunner who was so desperate for fame that he went to the Гунтstream of all places to garner a larger fanbase. Went from 0-2 viewers to 6, fucking amazing improvement john!
THE FAMILY
John also has an amazing family that does not at all explain the fact that he was allowed 24-hour access to the internet as a teenager, which did not at all fry his brain with cuckime and turn him into the sad pedophile that he is today. John's family consists of him; his father, who is also an autistic NEET who watches interracial cuckold porn, smokes weed, plays video games and bullies john all day irl and in vidya; his mother, who is the only member of the family with a job and who supports the entire household by herself; and his onee-san, a fat 28 year old with one son (her husband is a chubby chaser, apparently, according to John). John's last family member is his imouto, a 17 year old autistic girl who sucks at 2hu and apparently plays with Legos; she drives him around everywhere he goes bickers he doesn't know how to drive.
TIME ON THE ГунтSTREAM
John Barnhill only came on the Гунтstream for clout at the recommendation of his irl friend prmanager, a massive faggot in his own right. He knew little to nothing about lolcows, the Гунт or anything else, and he had little interest in it. prmanager seems to know a little, but I suspect he is a foxdick. John would stream himself playing video games, in which he was constantly bullied by a-logs for his lack of ability to play them. John is only good at 2hu and some quake rip-off free-to-play fps; he absolutely sucks at everything else. John in particular is really bad at platformers like Mario and Megaman. Besides streaming "games" he would also stream his "lovequest" ('member chris chan?) for an e-girlfriend. Highlights of these streams include meeting a girl that revealed herself to be 13:
>uhh sorry lad I can't do anything with you
>you are underage sorry
>like sorry but you are underage
>yeah I know but I can't date you bickers your 13 your underage
Despite this he hung around her a little longer instead of instantly leaving, and sounded almost like he was lamenting the fact that he couldn't legally have e-sex with this innocent girl; this is one of the first signs of john's pedophilia, but it's not the last. John tried courting one girl but she turned out to be a based chink who only dated within her race, and so the kike was stood up yet again.
>so uhhh ya I kinda need an e-girl girlfriend
>oh you do?


635 posts and 338 images omitted.
Open file (152.60 KB 1494x317 ClipboardImage.png)
The samefag from this post >>149625, who has also made multiple posts on tvch, is now larping as essayfag. He's the same guy who avatarfags and cries about Gahoole, /cow/ and vol3, but now he's trying to act like a sucker and pretending to be essayfag to stir shit.
>>150993 G​AMERGATEchu is always lurking this thread, he's already seething about your post
>>150993 >>151393 You are a legit asshurt spastic. Not only did you avoid responding to my post, the one you screencapped, but you came here to kvetch about it like a scorned woman and concoct some gay and cringe fanfiction about me.
Open file (157.96 KB 1457x601 pedo oldfag btfo dup.png)
Open file (124.61 KB 854x280 schizo lolicon.png)
Open file (313.21 KB 1525x712 schizo lolicon rant.png)
>>150993 >>151393 Here are some average posts on tvch.
Open file (204.10 KB 1449x736 pedo oldfags.png)

/robowaifu/ + /monster/, its benefits, and the uncanny valley Robowaifu Technician 05/03/2021 (Mon) 14:02:40 No.10259 [Reply]
Discussing the potential benefits of creating monster girls via robotics instead of 1-to-1 replicas of humans, and what parts can be substituted to get them into production as soon as possible.
Firstly, many of the animal parts that could be substituted for human ones are much simpler to work with than the human appendages, which have a ton of bones and complex joints in the hands and feet. My primary example of this is bird/harpy species (image 1), which have relatively simple structures and much less complexity in the hands and feet. For example, the wings of bird species typically only have around three or four joints total, compared to the twenty-seven in the human hand, while the legs typically only have two or three, compared to the thirty-three in the human foot. As you can guess, having to work with a tenth of the bones and joints, and without opposable thumbs and all that, makes things incredibly easier. And while I used bird species as an example, the same argument could be put forward for MG species with paws and other more simplistic appendages, such as Bogey (image 2) and insect hybrids (image 3).
Secondly, intentionally making it appear non-human circumvents the uncanny valley. It's incredibly difficult to make completely convincing human movement, and one of the simplest ways around that is just to suspend the need for it entirely. We as humans are incredibly sensitive to the uncanny valley of our own species; even something as benign as a prosthetic limb can trigger it. But if we were to create something that we don't expect to move in such a way, it's theoretically entirely possible to just not have to deal with it (for the extremities, anyway), leaving more time to focus on other aspects, such as the face. On the topic of the face, slight things could be substituted there too (again, for instance, insect girls), in order to draw attention away from the uncanny valley until technology is advanced enough that said uncanny valley can be eliminated entirely.
These possibilities, while certainly not to the taste of every anon, could be used as a way to accelerate production to the point that it picks up investors and begins to breed competition and innovation among people with wayyyyyyy more money and manpower than us, which I believe should be the end goal for this board as a whole. Any ideas or input is sincerely appreciated.
20 posts and 8 images omitted.
>>13697 There are Anons here working on making monster girls real. Monster girls and robowaifus make sense together.
>>13697 Imagine being so upset about one guy writing something in an online community. In a thread about monster girl robots. Dude.
>>13698 As you think >>13699 I will get mad on what I want.
Open file (490.38 KB 525x910 loona.png)
>put robo-skellington in life-size plush
>Uncanny valley solved?
You don't have to worry about replicating skin and it's more huggable than hard plastic
>>16492
Yep, good thinking Anon. And actually, we've had similar concepts going here for quite some time.

waifusearch> plush OR plushie OR daki OR dakimakura
THREAD SUBJECT                       POST LINK
AI Design principles and philoso ->  https://alogs.space/robowaifu/res/27.html#27        dakimakura
What can we buy today?           ->  https://alogs.space/robowaifu/res/101.html#101        "
Who wouldn't hug a kiwi.         ->  https://alogs.space/robowaifu/res/104.html#6127       "
        "                        ->  https://alogs.space/robowaifu/res/104.html#6132       "
        "                        ->  https://alogs.space/robowaifu/res/104.html#6176     plushie
        "                        ->  https://alogs.space/robowaifu/res/104.html#14761    daki
Waifus in society                ->  https://alogs.space/robowaifu/res/106.html#2267     dakimakura
Robot Voices                     ->  https://alogs.space/robowaifu/res/156.html#9092     plushie
        "                        ->  https://alogs.space/robowaifu/res/156.html#9093       "
Waifu Robotics Project Dump      ->  https://alogs.space/robowaifu/res/366.html#3501     daki
Robowaifu Propaganda and Recruit ->  https://alogs.space/robowaifu/res/2705.html#2738      "
/robowaifu/ Embassy Thread       ->  https://alogs.space/robowaifu/res/2823.html#10983   plushie



Open file (363.25 KB 1027x1874 MaidComRef.png)
MaidCom Development Kiwi 03/16/2022 (Wed) 23:30:40 No.15630 [Reply] [Last]
Welcome to the /robowaifu/ board's project. Together, we will engineer a modular robot that will serve and provide companionship to their Anon faithfully. See picrel for details on the current design. This project will begin with a basic maid robot, then move forward towards more capable robots with functionality approaching Chii/2B/Dorothy. The first goal is to have a meter-tall robot which functions as a mobile server, bearing an appearance that approximates her owner's waifu. This should be completed by December 2022, with further major updates happening yearly until Anons can build a PersoCom-class robot with relative ease and affordability.
162 posts and 81 images omitted.
Kywy, Anon just posted a video clip that could have potential for our meshtegrity approach for robowaifus:
>(>>16415 >Tendon-driven leg -related)
Open file (149.69 KB 634x1535 compositeic1.jpg)
>>16374
ROS has a lot of "documentation" but it actually says very little. I only found the source code because I looked at the Ubuntu compilation instructions, which suggested adding a source repository to Ubuntu, and the repository was hosted on Github. I'm wary of that organization. I jumped into the source code of one of their C projects and I see a load of useless wrappers around stdlib functions:
https://github.com/ros2/rcutils/blob/master/src/strcasecmp.c
https://github.com/ros2/rcutils/blob/master/src/strdup.c
Every line of code has a cost, and adding lines of code to add argument checking to the callee isn't worth it. Programmers will have to lose time learning those wrappers, using them will tie programs to ROS, and they don't actually provide any benefit: you still have an error you have to check if you pass null pointers to those stdlib wrappers. The other thing that code does is let the caller specify the memory allocator. I doubt they found a use for a custom memory allocator in their stdlib wrappers. Most likely they're just losing memory and performance by adding more parameters and layers of pointers to dereference to functions.
If you check at the callee, you still have to check at the caller, because then you have to check whether the callee returned an error. Worse, the errors are only going to pile up, and you'll have to come up with increasingly complex mechanisms to signal what kind of error happened. As errors from one part of the program are propagated to increasingly distant places, it will become impossible to handle them. If there is no way to signal what kind of error happened, as is the case here, error checking paradoxically becomes impossible because someone added error checking to a function.
If you check at the caller, the callee doesn't have to check anything, and every function it passes the argument to doesn't have to do any checking either. The errors aren't allowed to snowball, keeping error checking to a minimum, and errors are only checked where they might be acquired, keeping them close to their source. All of this without letting any error go unchecked. This is also way fewer lines of code, so it results in less code to maintain, smaller binaries, and fewer chances of bugs cropping up, because you can't have bugs if you don't have code. Errors should be checked where they might be acquired.
Also, type names ending with _t are reserved by POSIX: creating one such type is UB, though it's unlikely to cause trouble. And this memcpy call https://github.com/ros2/rcutils/blob/master/src/strdup.c#L52 copies 1 extra byte unnecessarily.
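A hedged sketch of the caller-vs-callee point above, in ordinary C-flavored C++ (illustration only, not code from ROS/rcutils, and assuming a POSIX strdup is available as in the code under discussion): pushing the null check into a wrapper just converts it into a return value the caller must test anyway, while checking once at the acquisition site keeps the error next to its source.

// Wrapper style: the callee validates its argument, but the caller still has to
// test the result, so nothing was saved and a second error path was created.
#include <cstdio>
#include <cstdlib>
#include <cstring>

char* checked_strdup(const char* s) {
    if (s == nullptr) return nullptr;   // argument check in the callee...
    return strdup(s);                   // ...yet the caller must still check the return
}

// Caller-checks style: validate where the pointer is acquired, then pass it on freely.
int main() {
    const char* name = std::getenv("WAIFU_NAME");  // the acquisition site (may be null)
    if (name == nullptr) {                         // single check, right at the source
        std::fputs("WAIFU_NAME not set\n", stderr);
        return 1;
    }
    char* copy = strdup(name);                     // 'name' needs no further null checks
    if (copy == nullptr) return 1;                 // only the new failure mode is checked
    std::puts(copy);
    std::free(copy);
    return 0;
}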


>>16231 How are you doing, Ricardo? You still with us bro? Just checking up on you, seems like it's been a couple weeks since the team has heard from you. Cheers.
>>16394 Thank you, and not so much. My connection got cancelled due to freak snow and I was stranded in Denver for 4 days. Made the most of it, but I only just arrived back home the other night and still have the remainder of a full workweek. What can I say, when it rains it pours (snows?).
>>16484 4 days? Wow, that sucks. Anyway, glad you're back safe and sound Meta Ronin.

The Sumomo Project Chobitsu Board owner 11/24/2021 (Wed) 17:27:18 No.14409 [Reply] [Last]
So I've been working for a while at devising an integrated approach to help manage some of the software complexity we are surely going to encounter when creating working robowaifus. I went down many different bunny trails and (often) fruitless paths of exploration. In the end I've finally hit on a relatively simple approach that AFAICT will actually allow us to have both the great flexibility we'll be needing, and that without adding undue overhead and complexity. I call this the RW Foundations library, and I believe it's going to help us all out a lot with creating workable & efficient software that (very hopefully) will allow us to do many things for our robowaifus using only low-end, commodity hardware like the various single-board computers (SBCs) and microcontrollers. Devices like the Beaglebone Blue and Arduino Nano, for example. Of course, we likely will also need more powerful machines for some tasks as well. But again, hopefully, the RW Foundations approach will integrate smoothly with that need as well and allow our robowaifus to smoothly interoperate with external computing and other resources. I suppose time will tell.
So, to commemorate /robowaifu/'s 5th birthday this weekend, I've prepared a little demonstration project called Sumomo. The near-term goal for the project is simply to create a cute little animated avatar system that allows the characters Sumomo and Kotoko (from the Chobits anime series) to run around having fun and interacting with Anon. But this is also a serious effort, and the intent is to begin fleshing out the real-world robotics needs during the development of this project. Think of it kind of like a kickstarter for real-world robowaifus in the end, but one that's a very gradual effort toward that goal and a little fun along the way. I'll use this thread as a devblog and perhaps also a bit of a debate and training forum for the many issues we all encounter, and how a cute little fairybot/moebot pair can help us all solve a few of them. Anyway, happy birthday /robowaifu/, I love you guys! Here is my little birthday present to you.
===
>rw_sumomo-v211124.tar.xz.sha256sum
8fceec2958ee75d3c7a33742af134670d0a7349e5da4d83487eb34a2c9f1d4ac *rw_sumomo-v211124.tar.xz
>backup drop


Edited last time by Chobitsu on 05/20/2022 (Fri) 20:36:38.
131 posts and 88 images omitted.
Open file (716.84 KB 500x281 shaking.gif)
Hello, newfag here. I have been lurking here for a bit. I am new to programming and C++ in general, so an experienced programmer would probably get a stroke looking at my code, but it seems to work and that is all I am looking for at this point. Today I finally got back propagation working in the neural network library in C++ that I was working on. So far I have created a CSV file parser, as well as a matrix operations library using std::vector. The neural nets use matrix algebra to compute the outputs. It can initialize a random neural network with weights between -1 and 1 using 2 vectors: the 1st to determine how many neurons are in each layer, the 2nd to determine the type of neuron in each layer (different activation functions), as well as Nx, the number of input variables. There are also functions to scale input and output data between 0 and 1 for faster input. I have finally got the 1st-order training method of gradient descent to work, with OK results; the training dataset isn't too large and IDK if I am using the correct network layouts, but the outputs seem somewhat on the mark. I need to learn a lot more about layouts of neural nets. The next step is a lot more testing, and adding the more efficient 2nd-order methods, which will involve computing the hessian and jacobian matrices. (RIP my brain) After I get these 2 methods working, I'll then get into actual chatbot learning to utilize these neural nets for my waifubot. I will post updates if that is OK.
My test folder: https://anonfiles.com/[redacted]/test.tar_gz
===
Note: It's bad form generally (and especially on IBs) to include anonymous precompiled binary executables inside a drop, Anon. Give it a shot making another drop that's just the source code, build files, and dataset files. Please post that instead, and I'll fix up your post here with the new droplink. Cheers.
>t. Chobitsu
>===
-add/edit admonition cmnt
Edited last time by Chobitsu on 05/24/2022 (Tue) 22:42:37.
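For anons wanting to picture what the post above describes, here is a deliberately tiny, hypothetical sketch of a matrix-algebra forward pass through one dense layer (y = sigmoid(W*x + b)) using std::vector. It is not the anon's library, and training/backprop are omitted entirely.

// Toy single-layer forward pass: weights W (out x in), bias b, input x.
#include <cmath>
#include <iostream>
#include <vector>

using Vec = std::vector<double>;
using Mat = std::vector<Vec>;  // row-major: Mat[row][col]

double sigmoid(double z) { return 1.0 / (1.0 + std::exp(-z)); }

Vec forward(const Mat& W, const Vec& b, const Vec& x) {
    Vec y(W.size());
    for (std::size_t i = 0; i < W.size(); ++i) {
        double acc = b[i];                    // start from the bias
        for (std::size_t j = 0; j < x.size(); ++j)
            acc += W[i][j] * x[j];            // accumulate the dot product
        y[i] = sigmoid(acc);                  // squash to (0, 1)
    }
    return y;
}

int main() {
    Mat W = {{0.5, -0.2}, {0.1, 0.4}};        // 2 inputs feeding 2 neurons
    Vec b = {0.0, 0.1};
    Vec x = {1.0, 2.0};
    for (double v : forward(W, b, x)) std::cout << v << ' ';
    std::cout << '\n';
}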
>>16408 Sorry, but it's better if you create the repository, you're the author, and I'm not looking for more responsibilities.
>>16451 Understood. Well, as I indicated ATM /robowaifu/ is my 'repository'. It certainly affords us a reasonable degree of communication. If you can find some way to dist + push patches up to catbox or some other file service, I'm sure we can get by.
>>16455 Should be doable with any of a git hook, gitlab action, or meson target. I've done tangentially related stuff but not this in particular. Either way, I'm already working with 2 dependencies /robowaifu/ uses to fix a bug and add a way to use Meson's dependency() so I already found something to work on.
>>16489 Excellent. Thanks Nagisa, your help will be very welcome! Cheers.

Open file (14.96 KB 280x280 wfu.jpg)
Beginners guide to AI, ML, DL. Beginner Anon 11/10/2020 (Tue) 07:12:47 No.6560 [Reply] [Last]
I already know we have a thread dedicated to books, videos, tutorials etc. But there are a lot of resources there, and as a beginner it is pretty confusing to find the correct route to learn ML/DL well enough to be able to contribute to the robowaifu project. That is why I thought we would need a thread like this. Assuming that I only have basic programming in python, dedication, love for robowaifus, but no maths, no statistics, no physics, no college education, how can I get advanced enough to create AI waifus? I need a complete pathway directing me to my aim. I've seen that some of you guys recommended books about reinforcement learning and some general books, but can I really learn enough by just reading them? AI is a huge field so it's pretty easy to get lost. What I did so far was to buy a great non-english book about AI: philosophical discussions of it, general algorithms, problem-solving techniques, its history, limitations, game theory... But it's not a technical book. Because of that I also bought a few courses on this website called Udemy. They are about either Machine Learning or Deep Learning. I am hoping to learn basic algorithms through those books, but because I don't have the maths it is sometimes hard to understand the concepts. For example, even when learning linear regression, it is easy to use a python library, but I can't understand how it exactly works because of the lack of Calculus I have. Because of that issue I have a hard time understanding algorithms.
>>5818
>>6550
Can those anons please help me? Which resources should I use in order to be able to produce robowaifus? If possible, you can even create a list of books/courses I need to follow one by one to be able to achieve that aim of mine. If not, I can send you the resources I got and you can help me to put those in an order. I also need some guidance about maths, as you can tell. Yesterday, after deciding and promising myself that I will give whatever it takes to build robowaifus, I bought 3 courses about linear alg, calculus, and stats, but I'm not really good at them. I am waiting for your answers anons, thanks a lot!
58 posts and 102 images omitted.
>>16420 Neat, thanks for the info Anon.
Hey Chobitsu! I am glad to see you again, yeah please go on and fix the post.
> So thanks! :^)
If it wasn't for you and the great contributors of the board, I would not have a place to post that, so I thank you! And the library thread was really necessary; I wish the board had a better search function as well. I was trying to find some specific posts and it took me a long while to remember which threads they were in.
> so maybe we can share notes
My University provided me with a platform full of questions. Basically, they have like 250 types of questions for precalculus, for instance. The system is automated, and what it does is generate an infinite number of questions of each specific type, and it also explains the solution for every question. It changes the variables randomly and gives you space to solve as many as you want. I believe that the platform requires money for independent use. Besides that, I just study from Khan Academy, but the book you mentioned caught my interest. I will probably look into it. If I ever find any good books on that matter, I will make sure to share them with you.
>>6560
>But there are a lot of resources there and as a beginner it is pretty confusing to find the correct route to learn ML/DL advanced enough to be able contribute robowaifu project.
I can give relatively uncommon advice here: DL is more about intuition + engineering than theory anyway. Just hack on things until they work, and feel good about it. Understanding will come later. Install pytorch and play with the tensor api, and go through their basic tutorial https://pytorch.org/tutorials/beginner/nn_tutorial.html while hacking on it and trying to understand as much as possible. Develop a hoarder mentality: filter r/MachineLearning and github.com/trending/python for cool repos and models, try to clone & run them, fix and build around them. This should be a self-reinforcing activity; you should not have other dopaminergic timesinks, because you will go the way of least resistance then.
Read cool people's repos to get a feeling for the trade:
https://github.com/karpathy
https://github.com/lucidrains
Read blogs:
http://karpathy.github.io/
https://lilianweng.github.io/posts/2018-06-24-attention/
https://evjang.com/2021/10/23/generalization.html
https://www.gwern.net/Scaling-hypothesis
https://twitter.com/ak92501
When you feel more confident, you can start delving into papers on your own; use
https://arxiv-sanity-lite.com/
https://www.semanticscholar.org/
https://inciteful.xyz/
to travel across the citation graph. git gud.
>>16460
>If it wasn't for you and the great contributors of the board, I would not have a place to post that so I thank you!
Glad to be of assistance Anon.
>And the library thread was really necessary, I wish that the board had a better search function as well.
Agreed.
>I was trying to find some specific posts and it took me a long while to remember which threads they were on.
You know Beginner-kun, if you can build programs from source, then you might look into Waifusearch. We put it together to deal with this conundrum. It doesn't do anything complex yet (Boolean OR), but it's fairly quick at finding related posts for a simple term. For example, to look up 'the Pile', pic related is the result for /robowaifu/ :
>
>-Latest version of Waifusearch v0.2a
>(>>8678)
>My University provided me with a platform full of questions
>It changes the variables randomly and gives you space to solve as much as you want.


>>16464 Thanks for all the great links & advice Anon, appreciated.

Open file (485.35 KB 1053x1400 0705060114258_01_Joybot.jpg)
Robowaifu Simulator Robowaifu Technician 09/12/2019 (Thu) 03:07:13 No.155 [Reply] [Last]
What would be a good RW simulator? I guess I'd like to start with some type of PCG solution that just builds environments to start with, and build from there up to characters.

It would be nice if the system wasn't just pre-canned, hard-coded assets and behaviors but was instead a true simulator system. E.g., write robotics control software code that can actually calculate mechanics, kinematics, collisions, etc., and have that work correctly inside the basic simulation framework first, with an eye to eventually integrating it into IRL Robowaifu mechatronic systems with little modification. Sort of like the OpenAI Gym concept, but for waifubots.
https://gym.openai.com/
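As a purely illustrative sketch of the 'calculate the kinematics first, drive real hardware later' idea (no particular simulator or robowaifu design assumed), here is forward kinematics for a 2-link planar arm; the same function a simulator back-end could validate before it ever commands physical joints. All numbers below are hypothetical.

// Forward kinematics of a 2-link planar arm: joint angles in radians,
// link lengths in metres, returns the end-effector position.
#include <cmath>
#include <iostream>

struct Point { double x, y; };

Point forward_kinematics(double theta1, double theta2, double l1, double l2) {
    Point p;
    p.x = l1 * std::cos(theta1) + l2 * std::cos(theta1 + theta2);
    p.y = l1 * std::sin(theta1) + l2 * std::sin(theta1 + theta2);
    return p;
}

int main() {
    const double pi = std::acos(-1.0);
    // Example: 30 deg shoulder, 45 deg elbow, 0.30 m upper arm, 0.25 m forearm.
    Point hand = forward_kinematics(30 * pi / 180, 45 * pi / 180, 0.30, 0.25);
    std::cout << "hand at (" << hand.x << ", " << hand.y << ") m\n";
}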
124 posts and 60 images omitted.
Open file (100.78 KB 1200x675 fb_habitat.jpg)
>>16415
Mujoco is state of the art in ~real-time jointed rigid-body physics simulation, nice taste, anon. Still, it's not a complete environmental simulator; it is very useful for limited-domain hard-dynamics manipulation and movement experiments.
>>155
I think FAIR's habitat simulation environment[1][2] is the most sensible choice for our needs. It's a complete system with physics, robot models, a rendering stack and ML integrations. It would be of major help to the project if we developed a waifu-specific randomized (to facilitate sim2real generalization) sim-environment, and collected enough behavioral data traces to pinpoint the necessary behavioral patterns, similar to deepmind's recent imitation learning (IL) tour de force: https://www.deepmind.com/publications/creating-interactive-agents-with-imitation-learning
If you choose this tool, feel free to ask for help if it somehow breaks.
1. https://github.com/facebookresearch/habitat-lab
2. https://github.com/facebookresearch/habitat-sim
>>16446 Thanks Anon! We'll have a look into it. >that ambient occlusion Nice. Cheers.
>>16446
So far, assimp is breaking the build. After a recursive checkout, merging with git submodule foreach git merge origin master led to these errors during that process:
CONFLICT (content): Merge conflict in code/Common/Version.cpp
...
CONFLICT (modify/delete): assimpTargets-release.cmake.in deleted in origin and modified in HEAD. Version HEAD of assimpTargets-release.cmake.in left in tree.
Pressing on with abandon, I did get a little way before it failed with:
FAILED: deps/assimp/code/CMakeFiles/assimp.dir/Common/Version.cpp.o
I wanted to give it a shot at least once, but ATM I can't afford the time required to dig in and try to fix such a complex system's build from source. But thanks Anon! It certainly looks interesting and I'm pleased to see a Globohomo behemoth such as F*cebook put out something this big with an MIT license.
>===


Edited last time by Chobitsu on 05/25/2022 (Wed) 09:37:45.
>>16453
I managed to build and run it on debian sid with a script inspired by this document https://github.com/facebookresearch/habitat-sim/blob/main/BUILD_FROM_SOURCE.md
Basically you clone the repo, checkout the latest stable tag, and update submodules recursively via the usual command git submodule update --init --recursive. I had to comment out a section in setup.py that deals with the cmake path to make it work, like this:
#try:
#    import cmake
#    # If the cmake python package is installed, use that exe
#    CMAKE_BIN_DIR = cmake.CMAKE_BIN_DIR
#except ImportError:
CMAKE_BIN_DIR = ""
Ensure you have cmake, the debian packages and the python libraries they require, then do python3 setup.py install --bullet. It should build several hundred source files via cmake and install the package.


>>16462
Thanks! I appreciate that you've avoided conda for this. I prefer to stick closer to the hardware when feasible. I'll give your instructions a shot at some point. I'm going to have to set up a dedicated machine at some point (hopefully soon).
>nix/guix or a container/vm
Do you have any preferences? I'm certainly averse to anything proprietary tbh.
>palace
Fancy rooms for fancy robowaifus! :^)
<those portraits are fabulous

Humanoid Robot Projects Videos Robowaifu Technician 09/18/2019 (Wed) 04:02:08 No.374 [Reply] [Last]
I'd like to have a place to accumulate video links to the various humanoid – particularly gynoid – robotics projects that are out there. Whether they are commercial-scale or small-scale projects, if they involve humanoid robots post them here. Bonus points if it's the work of a lone genius. I'll start: Ricky Ma of Hong Kong created a stir by creating a gynoid that resembled Scarlett Johansson. It's an ongoing project he calls an art project. I think it's pretty impressive even if it can't walk yet. https://www.invidio.us/watch?v=ZoSfq-jHSWw
===
Instructions on how to use yt-dlp to save videos ITT to your computer: (>>16357)
Edited last time by Chobitsu on 05/21/2022 (Sat) 14:20:15.
75 posts and 19 images omitted.
Open file (858.68 KB 1460x686 ClipboardImage.png)
A thorough, in-depth video on the history of MaSiRo that clearly shows exactly how she works. https://www.youtube.com/watch?v=7C33JQVZBuY
>>16350 Thanks Kywy! She's actually quite impressive seeing her in action. I just wish there was a translation of the text, I don't read Moon yet :^).
===
BTW, it occurs to me that some newer anons here may not know how to download robowaifu videos ahead of time, just in case TPTB decide they need to be memory-holed. Better safe than sorry, right Anon? :^)
>Warning: These following instructions are only for you if you are comfortable using the Linux terminal, etc.
So (running on a *NIX, obvs.) run these commands to install yt-dlp if you haven't yet:
sudo curl -L https://github.com/yt-dlp/yt-dlp/releases/latest/download/yt-dlp -o /usr/local/bin/yt-dlp
sudo chmod a+rx /usr/local/bin/yt-dlp
You can confirm it's installed by asking its version afterwards:
yt-dlp -v
It's also a good idea to:
a) download subtitles


Edited last time by Chobitsu on 05/22/2022 (Sun) 04:42:02.
>>14578 >>14624 Thanks for the videos Anon. The Byrun 'walking' video is rather interesting to me personally. At least Engineered Arts is actually addressing some (just some, mind you) of the complexities involved in this miraculous 'simple' act of kinematics. They still don't seem to get the importance of keeping mass and slung/thrown weight down. How are they going to power this thing, and for how long, lol? But they are obvs going for the military market with their operation (kek at the Terminator head posed on his table). Shame tbh. Still, there is something to be learned by studying their work; we'll just do it smarter & better. :^)
>>16357 Thanks for this helpful tool! >>16373 From what I can tell, that robot is designed to be plugged in 24/7. (Judging from its heavy-duty frame.) I completely overlooked it at first due to the obvious mass issues, but that knee mechanism is genuinely interesting. >We'll just do it smarter & better. :^) You know it Chobitsu! :^)
>>16375 >Thanks for this helpful tool! Y/w. Tbh, we have lots of video links here on our board. Someday, we might think about adding some kind of extension for Bumpmaster/Waifusearch to parse & download them all for Anon automatically? We have actually had a few memory-holed on us already, and it would have been nice to have preserved those from back in the day. >You know it Chobitsu! :^) Onward!

Open file (287.33 KB 1080x980 Pandora Project logo.png)
Open file (267.71 KB 515x735 Pandora WIP.png)
Pandora Project Thread AllieDev Board volunteer 03/05/2022 (Sat) 02:25:52 No.15404 [Reply] [Last]
This is the Pandora thread, where I hope to build up an overall modular robowaifu framework over time. This is an open board project and collaboration is encouraged. In my efforts to engineer a robowaifu, I'd found that going with a more robotic design would be faster and overall easier to engineer and assemble. For a start, I will post the progress of how far I've gotten on modeling Pandora before I begin the process of breaking everything down into printable parts and assembling functional prototypes. I had also changed her drones to have completely different functionality in my redesign, but that's fanfiction and not really relevant to the board. Engineering them is. Resources: Using Servos in English: https://archive.ph/28ehI The parts of a robot (what they don't want you to know /s): https://archive.ph/ddf6T >>EDIT: added resources and post formatting.
Edited last time by AllieDev on 04/26/2022 (Tue) 03:30:32.
83 posts and 33 images omitted.
>>16338 Just a quick reply to let you know I saw your post AllieDev. I'll be back later today and give you a good reply. Cheers.
Open file (1.30 MB 1920x1080 Loading Icon .gif)
>>16358 That sounds good Chobitsu, I look forward to what you have in mind for Pandora.
>>16370 >I look forward to what you have in mind for Pandora. You're the team leader for the project, AllieDev. I'll try to help out as I may. So, if you're asking for my advice then I would say basically that: A) We need to stay as nearly compatible with the MaidCom project as Pandora's design allows. B) We need to keep everything (designs, mechanical engineering, electrical engineering, sensors & computer hardware, software goals, etc., etc.) as small & simple & cheap as possible during this first prototyping attempt. After all, it's important to make something that just werks first, then we can blue-sky later as we go along, right Anon? That's pretty much it from my current perspective. If we can just follow these two basic goals, then we are very likely to succeed with Pandora right out of the gate. I pray we do so! :^) === As far as specifics relating to our software goals go, we're already working towards general-purpose solutions under the umbrella of the RW Foundations framework, as you're aware. The RW Pandora sub-project 'inherits' the vast bulk of her abilities from the main RW Arigato project. This is the same for all robowaifus that choose to adopt the Foundations in their systems' designs. Sumomo & Kotoko, for example, also indirectly inherit all their abilities from RW Arigato. Longer term, Arigato has many different capability goals on tap. Here's the current listing of member classes: >arigato.hpp snippet
inline auto animeyes() const -> Animeyes& { return animeyes_; }
inline auto atashi() const -> Atashi& { return atashi_; }
inline auto bones() const -> Bones& { return bones_; }

Message too long. Click here to view full text.

Open file (37.44 KB 239x239 hello world.png)
Open file (4.62 MB 1080x1080 Hello World.gif)
>>16376 >You're the team leader for the project AllieDev. I'll try to help out as I may. So, if you're asking for my advice then I would say basically that: Well, it's not a team if only one or two people ever get any input. :) >MaidCom project Not interested, after they bailed twice on something I believed in and that they claimed to be committed to. Twice. I had a nice long rant, but I don't see any point in continuing to respond to an entity who was always curt and dismissive of all my attempts at rapport, and who was just using me for my talents the entire time until I stood up for myself; then they flaked after claiming they were committed. Twice. After all was said and done, they still have not changed. I only wish to look out for newfags and returning anons who are unaware or uninitiated, as this is a very unique kind of board and it is my job as part of the community. I apologize if I come off as rehashing the past. I just mean to leave a word of caution based on my own experiences with them, and learning from my own mistakes. In better news, it's good to see newfags are still continuing to make their own projects and do their own thing regardless, as that is one of the board's biggest strengths. Would really hate to see it flooded with the terminal autism that has rendered most online discussion unusable in recent years. Unlike the inaccurate phrase "nothing good lasts forever," there is a better and more accurate saying: "No good thing ever dies." Something I believe applies to robowaifus and Pandora, as if the history of the board were an application of that phrase. >RW Foundations I had kept Foundations and easy access to the libraries in mind when designing the UI. Though you may consider taking your own advice and consolidating several of these libraries, as they cover the same things: Dollhouse + Electro, and Hand + Gears (insert 'do not fist the android' joke here :)), for example. >Where is the UI?

Message too long. Click here to view full text.

Edited last time by AllieDev on 05/23/2022 (Mon) 19:26:28.
>>16414 >Not interested As you see fit. I'll still try to help, but it will be practically impossible for me personally not to arrive at nearly-identical solutions to nearly-identical problems we face w/ Pandora. After all, lots of thought is going into that project. >I only wish to look out for newfags and returning anons who are unaware or uninitiated as this is a very unique kind of board and is my job as part of the community. Please put a sock in it. You're not helping our board's community, rather you're harming it. Imagine if I did the same with you? >Would really hate to see it flooded with the terminal autism that has rendered most online discussion unusable in recent years. I'm far too seasoned a shitposter for that sort of Troon-age to easily come off under my helmsmanship here, Anon. I greatly appreciate your concerns over the quality of the board, but just relax would be my suggestion in the matter. >Unlike the inaccurate phrase: "nothing good lasts forever." there is a better and more accurate saying: "No good thing ever dies." Something I believe applies to Robowaifus and Pandora. As If the history of the board has an application to that phrase. Absolutely! I like the way you think Anon. :^) >I had kept Foundations and easy access to the libraries in mind when designing the UI. Sounds good AllieDev, good thinking. As far as consolidations go, we'll always strive to keep a balance between the many competing design pressures. Thanks for the advice though. Really looking forward to your collection of GUI designs Anon! :^)

Message too long. Click here to view full text.


Open file (72.98 KB 1024x683 1646959262963.jpg)
Open file (359.37 KB 1128x777 1646957191624.png)
Kelly Wang/Mayli Thread Brian Holloman 03/11/2022 (Fri) 01:03:55 ID: bdce6c No.140503 [Reply]
>be hapa daughter of Jew Goldman Sachs VP. >get arrested for distributing weed during freshman year >muh daddy gives you a punishment >fuckyoudad.jpg >be a pornwhore for $1500 just to spite muh daddy >facialabuse >muh daddy gets wind of her porn escapades >buys the videos out of shame and DMCAs every single copy on the internet. >gets a thread on foxdickfarms bickers of this. >be associated with the likes of Killstream Гунт and Sonichu sperg. >open up YouTube channel >leave comments open How do spite and cum taste, Mayli?
7 posts and 8 images omitted.
Hmmm...what could the comments be all about?
Open file (655.95 KB 1151x615 see_orgre.png)
Whoa, real life Zachy Chan.
>>140690 Real life Zachy-chan? Is Zachy-chan just a larp?
>>140554 >200 dollars LMFAO
Sad situation. The chinese and the jews have generally been kind to me on a personal level and screwed me on an institutional level. I know several charming, loving chinese/jewish american guys that honestly wouldn't mind her. Post in this thread if you're interested kelly. We all want love at the end of the day.

Open file (2.07 MB 4032x2268 20220520_071840.jpg)
Ashiel - A Robowaifu Design Project SoaringMoon 05/20/2022 (Fri) 11:22:02 No.16319 [Reply]
< Introduction to This Thread This thread is going to be dedicated to my ongoing robowaifu project. This isn't exactly new; I have mentioned it here before in passing. However, this is the first thread I have opened specific to my robowaifu project and not an artificially intelligent chatbot. This thread will be updated when I have something to say about robowaifu design, or have an update on the project. Most of the content will be me proposing ideas or suggestions for developers to make the construction of a robowaifu easier. My design philosophy is one of simplicity and the solving of problems, instead of jumping to the most accurate simulacrum of the female human form. Small steps make incremental progress, which is something the community needs, because little progress is made at all. What progress we do make takes years of work, typically from a single person. Honestly, I'm getting tired of that being the norm in the robowaifu community. I'm frankly just done with that stagnation. Join me on my journey, or get left behind. < About Ashiel ASHIEL is an acronym standing for /A/rtificial /S/hell-/H/oused /I/ntelligence and /E/mulated /L/ogic. Artificial, created by man. Shell-Housed, completely enclosed. Intelligence and Emulated Logic are both a combination of machine learning-based natural language processing and tree-based lookup techniques. ASHIEL is simply Ashiel in any future context, as that will be her name. Ashiel is an artificially intelligent gynoid intended to specialize in precise movement and engage in basic conversation. Its conversational awareness would be at least equal to that of Replika, but with no chat filtering and a much larger memory sample size. If you want to know what this feels like, play AIDungeon 2. With tree-based lookup, it should be able to perform any of the basic tasks Siri or Alexa can perform: looking up definitions of words over the internet, managing a calendar, setting an alarm, playing music on demand... etc. The limitations of the robot are extensive. Example limitations include but are not limited to: the speaker will be located in the head's mouth area but will obviously come from an ill-resonating speaker cavity; the mouth will likely not move at all, and if so, not in any meaningful way. The goals of the project include: basic life utility; accurate contextual movement; the ability to self-clean; ample battery life; charging from a home power supply with no additional modifications; large memory-based storage with the ability to process and train in downtime; and yes, one should be able to fuck it. This is meant to be the first iteration in a series of progressively more involved recreational android projects. It is very unlikely the first iteration will ever be completed, of course. Like many before me, I will almost certainly fail. However, I will do what I can to provide as much information as I can so my successors can take up the challenge more knowledgeably. < About Me

Message too long. Click here to view full text.

13 posts and 12 images omitted.
>>16359 LiFePO4 is definitely the best possible option for waifus. If using that chemistry, it's mostly safe to keep it in the legs. Though, as has been mentioned, other Li-based chemistries would be dangerous in the arms and legs. >Balance As someone who did a lot of research into balance years ago: shifting upper-body mass or a low center of mass has many benefits compared to reaction wheels for bipeds. These include much lower energy consumption (low center-of-mass designs essentially require no energy for balance, though more energy to move), simpler design, and easier programming, since reaction wheels are finicky to get right. You could also just give them a tail; the easiest way to attain balance is to increase points of surface contact. A dress with wheels also helps.
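To make the 'low center of mass' point above concrete, here is a tiny sketch in Python with invented masses and heights (they are placeholders, not measurements from any real build) showing how moving the battery pack from the torso down into the legs lowers the combined center of mass, which is what buys the passive stability being described.
# Toy center-of-mass calculation; all masses (kg) and heights (m) are made up for illustration.
def center_of_mass(parts):
    total = sum(m for m, _ in parts)
    return sum(m * h for m, h in parts) / total

# (mass, height of that part's own CoM above the floor)
battery_in_torso = [(6.0, 0.30), (4.0, 0.90), (3.0, 1.10)]  # legs, torso+battery, head/arms
battery_in_legs  = [(9.0, 0.30), (1.0, 0.90), (3.0, 1.10)]  # same robot, battery moved down

print("CoM, battery in torso: %.2f m" % center_of_mass(battery_in_torso))
print("CoM, battery in legs:  %.2f m" % center_of_mass(battery_in_legs))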
>>16404 >Lol. Obvs, dear Sumomo-chan is intended for fairy-bot scale headpats! :^) Well, she is the size of a Christmas ornament lol. >'Right to Repair' I would stay away from any astroturfing buzzwords of the day. That term implies we sell things here and can sue them for fixing or modifying the product they bought.
>>16426 >astroturfing buzzwords No, Right to Repair implies we don't sue them for fixing or modifying. Right to Repair is the allowing side, not the denying side. >implies we sell things here Yes, did you think that if I'm building an android for private use you would be getting it free of charge? Advocating for Right to Repair means that if I do sell it to you, you would have permission to modify the product you own. The product should come with a manual that includes electrical and engineering schematics to better facilitate that. Calling it an "astroturfing buzzword" undermines the entire ideology of hacky customization of one's own electronics.
>>16424 One thing I'd like to point out, Kywy, is that my regular encouragement to 'centralize' a robowaifu's mass around her center of gravity is made with longer-term, fully-bipedal-capable robowaifu designs in mind. Basic physics for our more advanced mechanical designs to come, yeah? But please understand I'm not in any way suggesting the approach we're adopting in the MaidCom initial prototype is somehow flawed. Indeed I think it's literally the best compromise solution we can manage ATM for something that just werks + provides the basic facilities you've set for the project's goals. >tl;dr Drive on! :^)
>>16426 >Well she is the size of a christmas ornament lol. Heh. Headpat Waifus is the new category I intend to use for the union set of Fairy-bot + Moe-bot robowaifus. :^) (>>13018, R/W form factors related) >I would stay away from any astroturfing buzzwords of the day. Understood. The abuses of Apple and other lawyer-filled mega-Globohomo corporations sparked this debate long ago, it's nothing new. And actually, it's a real & legitimate grassroots resistance to some of the Globohomo's evil shenanigans, AllieDev.

Open file (156.87 KB 1920x1080 waifunetwork.png)
WaifuNetwork - /robowaifu/ GitHub Collaborators/Editors Needed SoaringMoon 05/22/2022 (Sun) 15:47:59 No.16378 [Reply]
This is a rather important project for the people involved here. I just had this amazing idea, which allows us to catalogue and make any of the information here searchable in a unique way. It functions very similarly to an image booru, but for markdown formatted text. It embeds links and the like. We can really make this thing our own, and put the entire board into a format that is indestructible. Anyone want to help build this into something great? I'll be working on this all day if you want to view the progress on GitHub. https://github.com/SoaringMoon/WaifuNetwork
4 posts and 1 image omitted.
Open file (57.45 KB 808x794 root_of_all.png)
>>16382 I'm really interested to see how this will turn out.
Thumbs up for a good project!
>>16380 >>16381 >>16383 Fair enough. I updated the board's JSON archive just in case you decide to take my advice, Anon: >the archive of /robowaifu/ thread JSONs is available for researchers >latest revision v220523: https://files.catbox.moe/gt5q12.7z As an additional accommodation for your team, I've here included a sorted, post-counted word list of the words contained on /robowaifu/, courtesy of Waifusearch (current as of today's archive). >searching tool (latest version: waifusearch v0.2a >>8678) Hopefully it will be of some use in your project's endeavor. > BTW, the latest version of stand-alone Waifusearch's source JSON archive should stay linked-to in the OP of our Library thread (>>7143). And on that note, would you please consider adding your findings into our library thread? That way anons like me who don't use your project will have some benefit from its improvements as well. That would be much-appreciated if so, Anon. Cheers.

Message too long. Click here to view full text.

Edited last time by Chobitsu on 05/23/2022 (Mon) 10:24:13.
>>16403 Thank you very much, that can help find the most-mentioned topics. But I'll only need the one. I'm probably going to parse the JSON to make this whole process easier.
Open file (143.99 KB 1003x871 graph.png)
>>16403 Yeah that was ludicrously fast. All of the threads are imported now; absolutely painless compared to what I was doing. Now we can start doing the real work.
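For anyone else who wants to do the same thing with the JSON archive linked above, a minimal sketch of the kind of import loop involved is below. The field names ("posts", "subject", "message") are assumptions about the LynxChan-style thread JSON in that archive and should be checked against one actual file before relying on them.
# Sketch: walk a directory of thread JSON files and dump every post's text.
# Field names are assumptions; inspect one file first and adjust as needed.
import json
from pathlib import Path

def iter_posts(archive_dir):
    for thread_file in Path(archive_dir).glob("*.json"):
        thread = json.loads(thread_file.read_text(encoding="utf-8"))
        yield thread.get("subject", ""), thread.get("message", "")   # the OP itself
        for post in thread.get("posts", []):
            yield post.get("subject", ""), post.get("message", "")

for subject, message in iter_posts("robowaifu_archive"):
    print(subject, "|", message[:80].replace("\n", " "))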

Open file (659.28 KB 862x859 lime_mit_mug.png)
Open-Source Licenses Comparison Robowaifu Technician 07/24/2020 (Fri) 06:24:05 No.4451 [Reply] [Last]
Hi anons! After looking at the introductory comment in >>2701, which mentions the use of the MIT licence for robowaifu projects, I read the terms: https://opensource.org/licenses/MIT Seems fine to me; however, I've also been considering the 3-clause BSD licence: https://opensource.org/licenses/BSD-3-Clause >>4432 The reason I liked this BSD licence is that endorsement using the creator's name (3rd clause) must be done by asking permission first. I like that term as it allows me to decide if I should endorse a derivative or not. Do you think that's a valid concern? Initially I also thought that BSD has the advantage of forcing retention of the copyright notice, however MIT seems to do that too. It has been mentioned that MIT is already used and planned to be used. How would these two licences interplay with each other? Can I get a similar term applied from BSD's third clause but with MIT?

Message too long. Click here to view full text.

Edited last time by Chobitsu on 07/24/2020 (Fri) 14:07:59.
48 posts and 9 images omitted.
Leaving this here as it has a clear, relatively easy-to-understand comparison of various licenses out there. https://www.openbsd.org/policy.html
I use ISC. It's almost a BSD-2-Clause equivalent. First, the text is more brief. Second, it doesn't specifically say that the license text has to be distributed with both the source and the binary forms of the program, it only says it has to be distributed with "copies". I use it for the brevity. I've noticed some code that gets reimplemented over and over again, often poorly, so I started a project that tries to implement it properly with BSD-0-Clause which is derived from the ISC license but has its only 2 terms removed so it's public domain equivalent except it disclaims all warranties. I did it because I know even being told to copypaste the license text and to include it in releases is too much for some. The reason I use ISC over BSD-0-Clause in other projects is that there's some legal bullshit that says that if your warranty disclaimer isn't visible it doesn't count, and I assume nobody will go after my ass due to warranties, but that's no reason to stay vulnerable, so I use ISC unless I have a reason not to. Ideally, copyright wouldn't exist, but the ISC license is written using language humans can read, and it essentially says "do whatever you want I don't care as long as you copy this license text" which is not perfect but good enough. I'm seeding terabytes of infringed copyright on my torrent client right now. I wish there was a copyleft license that was compatible with 100% of free licenses. The idea behind copyleft is good, but the issue with copyleft licenses is that they end up cucking other FLOSS developers from using your software, and that's unacceptable. Working with every single FLOSS program like the permissive licenses do is a hard requirement for me. But really, if someone copypasted my ISC software and didn't give me credit I wouldn't care, I would even be happy someone is using something I made, I really just want copyright and its supporters to die, to reiterate: singular reason that "copy the loicense pls" tidbit is in the license I use is to cover my ass, but ultimately, I want people to do whatever the heck they want with their copies.
BTW, OpenWRT violates the shit out of all the licenses and gets into 0 trouble for it. Go to the OpenWRT repositories right now and download any package: it's just a .tar.gz tarball with a different extension. Look into the contents of the package: there is no license text anywhere. A lot of FLOSS licenses require that the license text be distributed with the binary, but OpenWRT simply doesn't do it. I approve. OpenBSD does almost the same thing; they very rarely distribute the loicense, so they're violating the license of most of their packages that have such a term.
>>16352 >I use ISC. This is in essence just what OpenBSD advocates today. https://cvsweb.openbsd.org/src/share/misc/license.template?rev=HEAD

ROBOWAIFU U Robowaifu Technician 09/15/2019 (Sun) 05:52:02 No.235 [Reply] [Last]
In this thread post links to books, videos, MOOCs, tutorials, forums, and general learning resources about creating robots (particularly humanoid robots), writing AI or other robotics related software, design or art software, electronics, makerspace training stuff or just about anything that's specifically an educational resource and also useful for anons learning how to build their own robowaifus. >tl;dr ITT we mek /robowaifu/ school.
Edited last time by Chobitsu on 05/11/2020 (Mon) 21:31:04.
97 posts and 64 images omitted.
The Elements of Computing Systems, second edition: Building a Modern Computer from First Principles Noam Nisan, Shimon Schocken >I Hardware >"The true voyage of discovery consists not of going to new places, but of having a new pair of eyes." >t.Marcel Proust (1871–1922) >This book is a voyage of discovery. You are about to learn three things: how computer systems work, how to break complex problems into manageable modules, and how to build large-scale hardware and software systems. This will be a hands-on journey, as you create a complete and working computer system from the ground up. The lessons you will learn, which are far more important than the computer itself, will be gained as side effects of these constructions. According to the psychologist Carl Rogers, “The only kind of learning which significantly influences behavior is self-discovered or self-appropriated—truth that has been assimilated in experience.” This introduction chapter sketches some of the discoveries, truths, and experiences that lie ahead. 33E8664A26F52769692C070A31A96CCE
>(>>15925 related crosslink, OS design)
>just dropping this here for us, since neither seem to be present ITT yet? https://functionalcs.github.io/curriculum/ https://teachyourselfcs.com/ https://www.mooc.fi/en/ (pozz-warning, but much good stuff as well)
>(>>16124, related crosspost)
Open file (33.22 KB 342x443 1651114786655-0.jpg)
So I think the time has finally come at last. I hope for the fortitude of soul to finally pursue maths. As a repeat High School dropout (they kept putting me back in anyway lol), I absolutely loathed (and do so even more today) the modern public """education""" systems. So I felt at the time my decisions were well-merited. Heh. So, fast-forward to today and I barely know how to add 2+2 haha. :^) Missing out on the basics of algebra, trig, geometry, pre-calc, calc, stats, etc., is proving a big hindrance to my progress today for us here with robowaifus. Even though I'm now an adult man, I think I can still pick it up. The challenge is making the time to study on top of my already-overflowing plate, and the AFK pressures of keeping body & soul connected together. >tl;dr I'm starting with Euler's, wish me luck Anons! > >P.S. Feel free to pester me once a year -- say, during Summers -- to know how this little project is going. Sharp pointy-sticks can be a good thing after all. :^)

Open file (213.86 KB 406x532 13213d24132.PNG)
Open file (1.19 MB 1603x1640 non.png)
Robowaifu Technician 10/29/2020 (Thu) 21:56:16 No.6187 [Reply]
https://www.youtube.com/watch?v=SWI5KJvqIfg I have been working on the creation of waifus using GANs etc... I've come across this project and I am totally amazed. Does anyone have any idea how we can achieve this quality of animation with GAN-created characters? I think accomplishing this kind of work would have a huge impact on our progression. Calling all the people who posted in the chatbot thread.
1 post omitted.
Open file (213.22 KB 400x400 sample4.png)
Open file (199.16 KB 400x400 sample1.png)
Open file (194.92 KB 400x400 sample2.png)
Open file (199.43 KB 400x400 sample3.png)
>>6188 Looking at some old tweets from them, I think it is safe to say that it doesn't look much different from StyleGAN on portraits. The shoulders are bad, & most of the work is done by their data cleaning to simplify the problem. Interps & style mixing are nothing special either. Gwern's work with some whack data was able to create similar kinds of characters. Also waifulabs - which is all run by StyleGAN - can create some really quality characters from different positions. And notice that they are a game development studio which does not work on AI waifu creation. Looks like hype-bait to me, to be honest. They probably cherrypicked some of the results and maybe even manually tweaked them to create this kind of animation. And considering their budget and data, that is well possible. I am not sure if they still use StyleGAN though. They do not drop even a clue. But honestly, with the current state of it and the time they spent on it, I think they use a different approach.
My chief concern is first and foremost: is this open-source? If not, then it's relatively useless to us here on /robowaifu/, other than tangentially as inspiration. Hand-drawn, meticulously-crafted animu is far better in that role tbh.
>>6187 It appears the characters are generated with a GAN then another model separates the character pieces into textures for a Live2D model. They're not animated with AI, but there are techniques to do such a thing: https://www.youtube.com/watch?v=p1b5aiTrGzY
Video on the state of anime GANs, anime created by AI, including animation for vtuber/avatar-style animations: https://youtu.be/DX1lUelmyUo One of the guys mentioned in the video, creating a 3D model from a drawing, around 10:45 in the video above: https://github.com/t-takasaka - I didn't really find which repository it is on his GitHub yet, though he does seem to have some pose-estimation-to-avatar code there. Other examples in the video might be more interesting for guys trying to build a virtual waifu. "Talking Head Anime 2", based on one picture: https://youtu.be/m13MLXNwdfY
>>16245 This would be tremendously helpful to us if we can find a straightforward way to accomplish this kind of thing in our robowaifu's onboard systems Anon ('character' recognition, situational awareness, hazard avoidance, etc.) Thanks! :^)

Open file (17.64 MB 905x690 welcome_to_cow.gif)
Welcome to /cow/ Toad 04/25/2021 (Sun) 07:01:20 ID: 2aaf77 No.101360 [Reply] [Last]
Welcome to /cow/ /cow/ is the finest corner on the underbelly of the internet to discuss lolcows, or retarded individuals chuckles that can be milked for amusement. Rules 1. New threads must have a subject and pertain to established or potential lolcows. Off-topic discussion is permitted on Fridays in >>40288 2. Don't break US laws (SSNs, CP, etc.) Board Information 1. SpacePirate closed n0chan because he was tired of paying for it. NekoArc closed 888chan because he was tired of paying for it. 8chan died as a result of the (((El Paso shooting))), Julayworld was reorganized into Alogspace now Alogsspace because Robi no longer wanted to manage as large a website. 2. I am only managing the board in JEWS' absence, in his own words "I will return when the lolcows return," should he return he shall get full control, I am not the new BO. 3. #julayworld and #/cow/ on rizon are useful places to get into contact with the site's managers https://wiki.rizon.net/index.php?title=Register_your_nickname https://wiki.rizon.net/index.php?title=VHost 4. https://alogs.space/ is the main domain name, https://julay.world also redirects there
Edited last time by oldestfag on 04/05/2022 (Tue) 07:08:41.
99 posts and 19 images omitted.
>>149954 >>149961 I can assure you that >we haven't been kiked, I'll ask robi to update it when he can. t. janny

Open file (265.99 KB 1902x2476 pre-goolag YouTube.png)
Pre-Goolag YouTube Fellow Time Traveler 10/03/2020 (Sat) 17:51:14 No.848 [Reply]
YouTube was once also part of the old innocent, creative and fun internet, when at worst people would make a video in the hopes of it going viral. But that slowly changed thanks to the Goolag purchase, which kept ruining YouTube in small steps and turning it into Cable TV 2.0 and Spotify 2.0: >forcing people to move to a Goolag account to keep their channels and their videos >complying with copyrightniggers as part of the mutual ass-kissing with the government >giving monetary incentives to "content creators", aka self-important e-celeb faggots, which drew in greedy normalfag scum and were later removed, which in turn forced these greedy normalfags to beg for Patreon gibs and get sponsorships to advertise even more shit on youtube >adding ads in the middle of the fucking video as if it was cable tv >letting big cable tv channels have accounts on youtube >removing full albums and songs from non-corporate music channels and reuploading them as separate songs by fucking bots >increasing censorship and removing comments and videos >disorganizing the comments section into a complete messy shitshow >useless redesigning of the site over and over again >changing the 5-star video valuation system to the extra faggy like and dislike Let's reminisce about YouTube's better days, whether it was a specific channel or video and so on...
25 posts and 4 images omitted.
Open file (87.93 KB 1280x720 download (42).jpg)
BANISHED FROM EARTH CLASSIC GAME ROOM BROADCASTS FROM THE INTERGALACTIC SPACE ARCADE ON ITS NEVER-ENDING MISSION TO REVIEW... EVERYTHING!
>>850 The Bugs Bunny Birthday episode, I believe, was one of the kinos that got me hooked on his content way back when. The ones where he was talking about the different game magazines or comparing the Intellivision & Colecovision are also memorable to me.
Everyone post anus NOW!
>>988 It sucks what happened with him.
>>848 fuck niggers

Open file (1.70 MB 1280x712 1626774178655.png)
Lupo Lucio 07/20/2021 (Tue) 13:36:36 No.17081 [Reply] [Last]
Why don't you believe in science, oh Wolves? There seems to be a strong correlation between being right-wing and being anti-science. You won't end up well.
129 posts and 73 images omitted.
Open file (387.06 KB 954x659 Immagine.png)
>>19007 Bonus
Open file (357.47 KB 1311x593 Immagine.png)
Guys, the situation is extremely serious: if they don't roll out the fourth, fifth and sixth doses right away I'm going into withdrawal, AND BADLY BADLY BADLY! The Science, in the person of the absolutely conflict-of-interest-free Ricciardi, has spoken. https://archive.ph/XxOJW
Open file (8.04 MB 320x240 "esperti".mp4)
>esperti Imagine being eternally immortalized as the designated asshole going "muh I'm an expert". Testa is what he is, but people like Tozzi are room-temperature-IQ niggers.
Open file (3.80 MB 600x338 pfucked.webm)
Do we have a new neologism?
Open file (185.00 KB 500x522 1646083825703.png)

Open file (185.64 KB 1317x493 NS-VQA on CLEVR.png)
Open file (102.53 KB 1065x470 NS-VQA accuracy.png)
Open file (86.77 KB 498x401 NS-VQA efficiency.png)
Neurosymbolic AI Robowaifu Technician 05/11/2022 (Wed) 07:20:50 No.16217 [Reply]
I stumbled upon a couple of videos critiquing "Deep Learning" as inefficient, fragile, opaque, and narrow [1]. The claim is that Deep Learning requires too much data, yet it performs poorly when trying to extrapolate beyond the training set; how it arrives at its conclusions is opaque, so it's not immediately obvious why it breaks in certain cases; and all that learned information cannot be transferred between domains easily. They then put forth "Neurosymbolic AI" as the solution to DL's ills and the next step of AI, along with NS-VQA as an impressive example at the end [2]. What does /robowaifu/ think about Neurosymbolic AI (NeSy)? NeSy is any approach that combines neural networks with symbolic AI techniques to take advantage of both their strengths. One example is Neuro-Symbolic Dynamic Reasoning (NS-DR) applied to the CLEVRER dataset [3], which cascades information from neural networks into a symbolic executor. Another example is for symbolic mathematics [4], which "significantly outperforms Wolfram Mathematica" in speed and accuracy. The promise or goal is that NeSy will bring about several benefits: 1. Out-of-distribution generalization 2. Interpretability 3. Reduced size of training data 4. Transferability 5. Reasoning I brought it up because points 3 and 5, and to a lesser degree 4, are very relevant for the purpose of making a robot waifu's AI. Do you believe these promises are real? Or do you think it's an over-hyped meme some academics made to distract us from Deep Learning? I'm split between believing these promises are real and this being academics trying to make "Neurosymbolic AI" a new buzzword. [5] tries to put forth a taxonomy of NeSy AIs. It labels [4] as an example of NeSy since it parses math expressions into symbolic trees, but [4] refers to itself as Deep Learning, not neurosymbolic or even symbolic. Ditto with AlphaGo and self-driving car AI. And the NS-DR example was beaten by DeepMind's end-to-end neural network Aloe [6], and overwhelmingly so when answering CLEVRER's counterfactuals. A study reviewed how well NeSy implementations met their goals based on their papers, but its answer was inconclusive [7]. It's also annoying looking for articles on this topic because there are like five ways to write the term (Neurosymbolic, Neuro Symbolic, Neuro-Symbolic, Neural Symbolic, Neural-Symbolic). >References [1] MIT 6.S191 (2020): Neurosymbolic AI. <https://www.youtube.com/watch?v=4PuuziOgSU4> [2] Neural-Symbolic VQA: Disentangling Reasoning from Vision and Language Understanding. <http://nsvqa.csail.mit.edu/>

Message too long. Click here to view full text.

Open file (422.58 KB 1344x496 MDETR.png)
Open file (68.28 KB 712x440 the matrix has you.jpg)
>>16217 I think such critique is outdated. The impressive results of NS-VQA have been beaten by full deep learning approaches like MDETR.[1] It would be a bit ironic to call deep learning fragile and narrow and then proceed to write specific functions that only handle a certain type of data (of which the training set just happens to be a small subset) and call it generalization. Sure it can handle 'out-of-distribution' examples with respect to the training set, but give it a truly out-of-distribution dataset with respect to the functions and these handwritten methods will fail completely. A lot of deep learning approaches these days can learn entirely new classes of data from as few as 10-100 examples. ADAPET[2] learns difficult language understanding tasks from only 32 examples. RCE[3] can learn from a single success-state example of a finished task. DINO[4] can learn to identify objects from no labelled examples at all. CLIP[5] and CoCa[6] are examples of deep learning generalizing to datasets they were never trained on, including adversarial datasets, and outperforming specialized models, and this is just stuff off the top of my head. Someone ought to give DALL-E 2 the prompt "a school bus that is an ostrich" and put that meme to rest. That said, neurosymbolic AI has its place, and I've been using it lately to solve problems that aren't easily solvable with deep learning alone. There are times when using a discrete algorithm saves development time or outperforms existing deep learning approaches. I don't really think of what I'm doing as neurosymbolic AI either. Stepping away from matrix multiplications for a bit doesn't suddenly solve all your problems and become something entirely different from deep learning. You have to be really careful actually, because often a simpler deep learning approach will outperform a more clever-seeming neurosymbolic one, which is clearly evident in the progression of AlphaGo to AlphaZero to MuZero. From my experience it hasn't really delivered much on the promises you listed, except maybe points 2 and 5. I wouldn't think of it as something good or bad though. It's just another tool, and it's what you do with that tool that counts. There was a good paper on how to do supervised training on classical algorithms. Basically you can teach a neural network to do a lot of what symbolic AI can do, even complicated algorithms like 3D rendering, finding the shortest path or a sorting algorithm. I think it shows we've barely scratched the surface of what neural networks are capable of doing. https://www.youtube.com/watch?v=01ENzpkjOCE https://arxiv.org/pdf/2110.05651.pdf >Links 1. https://arxiv.org/abs/2104.12763 2. https://arxiv.org/abs/2103.11955 3. https://arxiv.org/abs/2103.12656 4. https://arxiv.org/abs/2104.14294 5. https://arxiv.org/abs/2103.00020

Message too long. Click here to view full text.
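For anons wondering what the contrastive objective behind CLIP/CoCa mentioned in the post above actually looks like, here is a bare-bones PyTorch sketch (illustrative only, not any particular paper's implementation): embed images and captions, normalize, and push matching pairs together with a symmetric cross-entropy over the similarity matrix. The embedding size, batch size and temperature below are arbitrary placeholders.
# Minimal CLIP-style contrastive loss sketch; random tensors stand in for encoder outputs.
import torch
import torch.nn.functional as F

def contrastive_loss(image_emb, text_emb, temperature=0.07):
    # Normalize embeddings so dot products are cosine similarities.
    image_emb = F.normalize(image_emb, dim=-1)
    text_emb = F.normalize(text_emb, dim=-1)
    logits = image_emb @ text_emb.t() / temperature   # [batch, batch] similarity matrix
    targets = torch.arange(len(image_emb))            # i-th image matches i-th caption
    loss_i2t = F.cross_entropy(logits, targets)       # image -> text direction
    loss_t2i = F.cross_entropy(logits.t(), targets)   # text -> image direction
    return (loss_i2t + loss_t2i) / 2

imgs, txts = torch.randn(8, 512), torch.randn(8, 512)
print(contrastive_loss(imgs, txts))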

Open file (201.23 KB 1133x1700 spaghetti_mama.jpg)
Idling around the Interwebz today[a], I found myself reading the Chinese Room Argument article on the IEP[b], and I came across the editors' contention that the notion that "mind is everywhere" is an "absurd consequence". >"Searle also insists the systems reply would have the absurd consequence that “mind is everywhere.” For instance, “there is a level of description at which my stomach does information processing” there being “nothing to prevent [describers] from treating the input and output of my digestive organs as information if they so desire.” "[1],[2] I found that supposed refutation of this concept vaguely humorous on a personal level. As a devout Christian Believer, I would very strongly assert that indeed, Mind is everywhere. Always has been, always will be. To wit: The Holy Spirit sees and knows everything, everywhere. As King David wrote: >7 Where can I go to escape Your Spirit? > Where can I flee from Your presence? >8 If I ascend to the heavens, You are there; > if I make my bed in Sheol, You are there. >9 If I rise on the wings of the dawn, > if I settle by the farthest sea, >10 even there Your hand will guide me; > Your right hand will hold me fast.[3] However, I definitely agree with the authors in their writing that >"it's just ridiculous" to assert >" “that while [the] person doesn’t understand Chinese, somehow the conjunction of that person and bits of paper might” ".[1],[2]

Message too long. Click here to view full text.

Edited last time by Chobitsu on 05/16/2022 (Mon) 00:13:09.

Reploid thread Robowaifu Technician 02/28/2022 (Mon) 04:13:32 No.15349 [Reply]
A few people thought it'd be a good idea to start a thread for Reploid builds, so here we are! To kick things off, here's a little progress on painting my own RiCO. It's just spray paint so it doesn't look stellar, and I screwed up a couple of parts. All the blue trim paint needs done as well. I don't care if it's perfect, I just want to get her done. Not to mention anything adjacent to "art" or "craftsmanship" is beyond me, mostly due to lack of patience: I don't want to spend 1000s of hours grinding away with a paintbrush when I could be designing up cool (to me...) robotic mechanisms, for instance. I'll bet my bottom dollar none of my projects will be winning awards in the fit-and-finish department. Can't wait to see what happens with Pandora and whatever other Reploid projects people might be working on.
36 posts and 28 images omitted.
>>16021 >Great idea, mind if I borrow this technique for MaidCom? Please do. Check out how I did the rest of her eyes as well, perhaps it could scale up.
>>16029 It does scale up really well. Though I will innovate upon it for Pandora.
Edited last time by AllieDev on 04/27/2022 (Wed) 01:32:08.
>>15999 I'm really impressed with this. I have a lot of questions, I'll scan the rest of this thread to make sure they aren't already answered first.
Curious if you had any changes this week with your wonderful robowaifu, RiCOdev?
>>16243 RiCO will be on hold for the near future; your friendly neighborhood reploid builder has got bigger fish to fry. I'll still keep an eye out here to answer questions etc.

Visual Waifus Robowaifu Technician 09/15/2019 (Sun) 06:40:42 No.240 [Reply] [Last]
Thoughts on waifus which remain 2D but have their own dedicated hardware. This is more on the artistry side, though AI is still involved. An example of an actual waifu product is the Gatebox.
gatebox.ai/sp/

My favorite example is Ritsu, she's a cute AI from Assassination Classroom whose body is a giant screen on wheels.
188 posts and 92 images omitted.
>>15954 Part of the program would need to respond to the touch event immediately, such as if you stroke a waifu's hair it should move right away. The language model would also take into account this touch event to produce a sensible response instead of making responses that are oblivious to them. It could also generate more complex animation instructions in the first tokens of a response, which would have about a 250ms delay similar to human reaction time. It's not really desirable to have a set of pre-made animations that the waifu is stuck to since after seeing them over and over again the waifu will feel rigid and stuck to replaying them. With the language model though you could generate all kinds of different reactions that take into account the conversation and touch events.
>>15956 OK, I'll take your word for it Anon. I'm sure I'll understand as we work through the algorithms themselves, even if the abstract isn't perfectly clear to me yet. You can be sure I'm very attuned to the needs of efficient processing and timely responses though! Lead on! :^) >>15953 BTW, thanks for taking the trouble of posting this Anon. Glad to see what these game manufacturers are up to. Nihongo culturalisms are pretty impactful to our goals here on /robowaifu/ tbh. Frankly they are well ahead of us for waifu aesthetics in most ways. Time to catch up! :^)
>>15953 That Madoka is the epitome of cuteness. If only there were a way to capture that voice and personality and translate it into English. >>15956 Timing of reactivity is important for preventing the uncanny valley from a communications standpoint. For her animations, it may be effective to have several possible animations for various responses that are chosen at random, though never repeating. Like, having a "welcome home" flag that triggers an associated animation when she's saying "welcome home".
>>15967 >Timing of reactivity is important for preventing the uncanny valley from a communications standpoint. You know I just had a thought at reading this sentence Kywy. It's important to 'begin' (as in, within say, 10ms) a motion, even though it isn't even her final form yet. :^) What I mean is that as soon as a responsive motion need is detected, then her servos should begin the process immediately, in a micro way, even if the full motion output hasn't been decided upon fully yet. IMO this sort of immediacy to response is a subtle clue to 'being alive' that will subconsciously be picked up on by Anon. As you suggest, without it, a rapid cascade into the Uncanny Valley is likely to ensue. It's not the only approach that's needed to help solve that issue, but it's likely to be a very important aspect of it. Just a flash insight idea.
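A rough, runnable sketch of that two-stage idea is below: fire a tiny acknowledging motion the moment a touch event arrives, then let the slower 'decide the full reaction' step (language model, animation picker, whatever) catch up afterwards. The servo and planning calls are hypothetical stubs standing in for real hardware and model code; only the timing structure is the point.
# Sketch of "begin moving within ~10 ms, decide the full response later".
# nudge_servos() and plan_full_reaction() are made-up stubs for real hardware/model calls.
import asyncio, random

async def nudge_servos(region):
    print(f"[~10 ms] micro-motion toward touched region: {region}")

async def plan_full_reaction(region):
    await asyncio.sleep(0.25)                 # stand-in for ~250 ms model latency
    return random.choice(["lean in", "giggle", "turn head"])

async def on_touch(region):
    await nudge_servos(region)                # immediate, before any deliberation
    reaction = await plan_full_reaction(region)
    print(f"[~250 ms] full reaction chosen: {reaction}")

asyncio.run(on_touch("hair"))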
I recognize that b/c of 'muh biomimicry' autism I have, I'm fairly inclined to go overboard into hyperrealism for robowaifus/visualwaifus, even though I know better. So my question is >"Are there simple-to-follow guidelines to keep from creating butt-fugly uncanny horrors, but instead create cute & charming aesthetics in the quest for great waifus?" Picrel is from the /valis/ thread that brought this back up to my mind. > https://anon.cafe/valis/res/2517.html#2517

Waifu Robotics Project Dump Robowaifu Technician 09/18/2019 (Wed) 03:45:02 No.366 [Reply] [Last]
Edited last time by rw_bumpbot on 05/25/2020 (Mon) 04:54:42.
240 posts and 174 images omitted.
>>15733 >link Google is your friend: https://sweetiebot.net/ From what I understand they want to keep things rated PG. The voice generator (community talknet project) is unrelated and based in the /ppp/ thread on 4chan.org/mlp/. Enter if you dare ^:)
Open file (117.25 KB 640x798 cybergirl.jpg)
>>15731 >ponies I was thinking more >>15733 or picrel, but a cute robot horse has the PR advantage because it could easily be a children's toy.
>>15731 I think some of the ponies mentioned this project to us before Anon, thanks. I wish those Russians good success! >>15733 Heh, /monster/ pls. :^) A hexapod waifu is actually a really good idea for a special-service meido 'waifu' IMO. Just my own tastes in the matter, subjectively. But objectively, a hexapod locomotion base (especially combined with roller 'feet') is a superior stability platform from which to do housework & other work. No question. >>15736 Yep. I immediately came to a similar conclusion. But it's obvious they are going for police-force service with the push for that bot, and the price tag shows it. Shame, tbh.
are you people serious? all the videos are hosted offsite what the hell am I going to do with a filename put it in yandex? fucking yahoo.jp??? tor search??? why do this?
>>16233 Heh, sorry about that Anon. You're just dealing with the missing information from when our first site was destroyed on us. Those were part of the recovery effort. Unfortunately our copies of the files were lost in the attack. Maybe someday some Anon might restore them here for us. Again apologies, you might see similar in other threads here too. But at least we still have a board! :^)

Electronics General Robowaifu Technician 09/11/2019 (Wed) 01:09:50 No.95 [Reply] [Last]
Electronics & Circuits Resources general

You can't build a robot w/o good electronics. Post good info about learning, building & using electronics.

www.allaboutcircuits.com/education/
71 posts and 19 images omitted.
>>14824 OK, I think this is a reasonably good thread (unless you have a better one in mind?) Thanks again, Anon.
>>14824 Interesting. What does this imply? Is there a significant improvement in the performance of the chips using this style of transistor? If so, then we'll have to make our own asic using that same method
Open file (79.85 KB 600x424 BPv4-f.jpg)
>>734 For any Anons currently working on electronics boards that would benefit from Bus Pirate, there is also a v4 that has more RAM available. > http://dangerousprototypes.com/docs/Bus_Pirate_v4_vs_v3_comparison The firmware code is also available. https://github.com/BusPirate/Bus_Pirate
Open file (427.08 KB 1500x1500 super start kit.jpg)
Open file (127.68 KB 1649x795 rip mr servo.png)
I'm looking to get into electronics. Are the ELEGOO UNO starter kits any good? There's one on Amazon for $40. I basically just want to learn how to program a microcontroller, control servos with a controller and understand enough so I can start building a robowaifu. Or should I save my money and just play with the circuit simulator in TinkerCAD?
>>16224 I actually have the kit on the left, and I definitely recommend them for learning Anon, sure.
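If you go with one of those Uno kits, one low-friction way to control a servo from a real language is to flash the board with the stock StandardFirmata sketch (bundled with the Arduino IDE) and drive it from Python with pyFirmata. A hedged sketch is below; the serial port name and pin number are assumptions about your particular setup, and power the servo separately if it's anything bigger than a 9g micro.
# Sketch: sweep a hobby servo on pin 9 of an Uno running StandardFirmata.
# Port name ("/dev/ttyUSB0") and pin are assumptions; check your own setup.
import time
from pyfirmata import Arduino

board = Arduino("/dev/ttyUSB0")
servo = board.get_pin("d:9:s")    # digital pin 9, servo mode

for angle in (0, 90, 180, 90, 0):
    servo.write(angle)            # angle in degrees
    time.sleep(0.5)

board.exit()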

Open file (1.04 MB 2999x1298 main-results-large.png)
Language Models as Zero-Shot Planners: Extracting Actionable Knowledge for Embodied Agents Robowaifu Technician 01/21/2022 (Fri) 15:35:19 No.15047 [Reply]
https://wenlong.page/language-planner/ https://github.com/huangwl18/language-planner Can world knowledge learned by large language models (LLMs) be used to act in interactive environments? In this paper, we investigate the possibility of grounding high-level tasks, expressed in natural language (e.g. "make breakfast"), to a chosen set of actionable steps (e.g. "open fridge"). While prior work focused on learning from explicit step-by-step examples of how to act, we surprisingly find that if pre-trained LMs are large enough and prompted appropriately, they can effectively decompose high-level tasks into low-level plans without any further training. However, the plans produced naively by LLMs often cannot map precisely to admissible actions. We propose a procedure that conditions on existing demonstrations and semantically translates the plans to admissible actions. Our evaluation in the recent VirtualHome environment shows that the resulting method substantially improves executability over the LLM baseline. The conducted human evaluation reveals a trade-off between executability and correctness but shows a promising sign towards extracting actionable knowledge from language models. I think it's worth a whole thread. If not, move it to the appropriate section.
Update in this topic, this time by DeepMind. https://www.youtube.com/watch?v=L9kA8nSJdYw Slowly, they realize what we all think about, yet only in virtual spaces.
Open file (251.80 KB 1665x621 calm outline.png)
Open file (172.15 KB 859x785 calm planning.png)
Open file (85.32 KB 827x627 calm success.png)
Open file (337.52 KB 1523x725 coca.png)
Open file (119.53 KB 901x670 coca performance.png)
>>16197 >Human: show me how you polish the baluster >Robowaifu: say no more goshujin-sama There was a really interesting paper recently on context-aware language models that can do planning and achieve better than human performance on a flight-booking task, using a model the same size as GPT-2 small: https://sea-snell.github.io/CALM_LM_site/ It shows a lot of promise for using a Monte Carlo tree search for doing planning with language models, since it only takes 5 dialog generation attempts with the new method to outdo human performance without doing a tree search at all. Also a huge breakthrough in zero-shot multi-modal learning has been made that completely blows CLIP and SotA specialized models to pieces by using a simple to understand contrastive and captioning loss (CoCa) that can leverage existing models: https://arxiv.org/pdf/2205.01917.pdf This is going to be huge for embodied agents. It's a lot like the similarity measure used between sentence embeddings produced by the RoBERTa model in OP's paper to ensure the translated generated instructions are executable, except it does it between images and sentences. And there's another paper worth mentioning doing transfer learning from a language model trained on Wikipedia to an RL agent (on continuous control and games) that outperforms training from scratch: https://arxiv.org/pdf/2201.12122.pdf It seems we're headed towards a leap forward soon with goal-oriented embodied agents using language models.
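The 'semantically translate generated steps to admissible actions' trick from OP's paper is simple enough to sketch. Below is a hedged version using the sentence-transformers package: embed the generated step and every admissible action, then take the cosine-nearest action. The model name is just a common default, not what the paper used, and the action list is invented for illustration.
# Sketch: map a free-form generated step to the closest admissible action by embedding similarity.
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("all-MiniLM-L6-v2")   # model choice is an assumption

admissible = ["open fridge", "grab milk", "close fridge", "walk to kitchen", "turn on stove"]
generated_step = "take the milk carton out of the refrigerator"

act_emb = model.encode(admissible, convert_to_tensor=True)
step_emb = model.encode(generated_step, convert_to_tensor=True)

scores = util.cos_sim(step_emb, act_emb)[0]       # cosine similarity against each action
best = int(scores.argmax())
print(f"'{generated_step}' -> '{admissible[best]}' (score {float(scores[best]):.2f})")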

Artificial Wombs general Robowaifu Technician 09/12/2019 (Thu) 03:11:54 No.157 [Reply] [Last]
I think we need to figure out how to fit a womb onto a waifubot. Where's the fun in having sex if you can't procreate?

Repost from a thread on /b/;
>"If you're like me and want to fuck a qt battlebot and get her pregnant, the best place to put an artificial womb is where a normal womb would be on a normal girl. The metal exterior could simply be a bunch of metal plates that unfold to allow space for the womb pod inside. The baby is probably safest inside the battlebot, and if she has good calibration then there shouldn't be problems with her falling and hurting the baby. After giving birth the metal plates could automatically fold back up again, shrinking the womb pod inside so she is combat effective again."

Well /robowaifu/? Feasible?
124 posts and 14 images omitted.
Open file (274.25 KB 650x1000 SplashTittyMonster.jpg)
Open file (1.40 MB 2189x1305 1586409061467.png)
Before making artificial wombs, how about focusing on how to make artificial functional mammary glands? If I did clone myself then I'd have something to feed him with, and if I didn't at least I could suck them myself. It also seems like a significantly lower bar in terms of complexity compared to an entire functioning uterus.
>>13154 Because the wombs aren't for being put into the robowaifus but for having children. Glands aren't necessary. However, it seems to be possible to create genetically modified yeast that produces all kinds of milk, so it's not even a problem. I don't know about the current state; here are two vids I haven't watched myself yet, the first short, the other really long: https://youtu.be/CXYg-qt4OCc https://youtu.be/ZiWnygcYsiQ I thought these had been available for years, but somehow didn't hear more about it. So I'm curious myself.
>>13189 >Because the wombs aren't for being put into the robowaifus but for having children. I've argued with some weirdos who were very insistent that the wombs should be in the robowaifus, just because it's their fetish. I'm not saying mammary glands need to be in the waifubot, but if they fit, and you're into that, I don't see why not. Either way they'd be useful for raising the kid. >Glands aren't necessary. I'm going to have to disagree with you on that. I don't know a lot about fetal development, but I do know varying blood hormone levels can have significant effects on fetal development. And hormones largely come from and are regulated by glands. When it comes to artificial organs it seems like glands are largely overlooked. As far as I know there's something of an artificial ovary and an artificial thymus, something that literally disappears with age, but other artificial endocrine organs basically don't exist. I think that in order to create a perfectly healthy baby with an artificial womb, you'd either need to replicate almost all of the other organs in the body to digest food/supplements to make blood or have a regular supply of healthy pregnant woman blood. The simple fact that blood plasma donations are still a thing highlights the fact that we still can't create an adequate blood plasma substitute. All an artificial mammary gland would do is make realistic breast milk, which is a simple task compared to making an artificial womb, but is still a difficult task on its own and there may be some problems it solves that would be needed to solve for the artificial wombs.
Alright folks, there's a lot of good information being shared about what scientists are doing with this stuff. Here's a news flash though: Anyone can follow the scientific method. We can break this whole thing down into smaller parts and grassroots this shit. >ctrl-f 'amniotic fluid' >no results What does a human body do? Eat. What does a pregnant human body do? Eat. What does a human fetus do? Absorb nutrients from what the mother eats. Chemically speaking, literally everything that the human body is capable of is available in some form or another at your local grocery store. My question is how difficult would it be to mix together a facsimile for amniotic fluid? What makes it up? How much of what, and how does that change during gestation? We need recipe cards or something.

Open file (173.41 KB 1080x1349 Alexandra Maslak.jpg)
Roastie Fear 2: Electric Boogaloo Robowaifu Technician 10/03/2019 (Thu) 07:25:28 No.1061 [Reply] [Last]
Your project is really disgusting >=== Notice #2: It's been a while since I locked this thread, and hopefully the evil and hateful spirit that was dominating some anons on the board has gone elsewhere. Accordingly I'll go ahead and unlock the thread again provisionally as per the conditions here: >>12557 Let's keep it rational OK? We're men here, not mere animals. Everyone stay focused on the prize we're all going for, and let's not get distracted. This thread has plenty of good advice in it. Mine those gems, and discard the rest. Notice:

Message too long. Click here to view full text.

Edited last time by Chobitsu on 09/02/2021 (Thu) 18:36:20.
118 posts and 37 images omitted.
>>13408 You aren't even taking into consideration financing options. I bet there are places that will do monthly payments, including insurance.
Open file (639.48 KB 1374x690 1525099257922.png)
Why do they do this? How different would it be if a robot wife was what they were looking at? >--- SFW board Anon, pls keep it spoilered here. Thanks!
Edited last time by Chobitsu on 04/18/2022 (Mon) 05:40:31.
>>15920 Not too much different. I've mentioned the idea of artificial wombs to some women (in person) to bypass the abortion debate. Some of the most timid girls I know went into a flying rage at the thought of their potential replacement. I personally think the roastie would be willing to commit violence at any prospect of them losing their current power hold. She will vote for any and all laws banning such. Thus, it may be best to avoid mentioning it to them, and just ignoring them and only sharing amongst men.
>>15920 >Why do they do this? b/c they're women >How different would it be if a robot wife was what they were looking at? They would actually be angrier. Remember that women are operating on instinct and feels. Sure, some may actually do an internal logic/fact check, but those are the unicorns; most will operate on their feelings about something and justify it post-hoc with whatever. Example: robotic replacements will terrify them. Behavior: finding anything and everything to "shame" this idea, as though there is some higher moral principle being violated (this is to appeal to men's guilt), when in fact it is just that women are terrified whenever attention is taken away from them (whether by alcohol, sports, video games, other women, or now... gynoids/robowaifus).
>>16137 This. Clear & simple.

Green Pass Lupo Lucio 01/17/2021 (Sun) 19:14:53 ID: c7fc4e No.14206 [Reply] [Last]
-----BEGIN PGP SIGNED MESSAGE----- Hash: SHA256 Hello little wolves, I've modified VerificaC19 and Immuni: the first so that it saves scanned green passes to "Download/green pass", the second to allow importing green passes by entering the code saved by VerificaC19. You'll find more information in the LEGGIMI (README) files the apps save to "Download/green pass" when you launch them. You can find the source code and the APKs here: https://gitgud.io/lupetto/it-dgc-verificaC19-android/-/releases https://gitgud.io/lupetto/immuni-app-android/-/releases If Google warns you that the apps might contain malware, ignore it and install them anyway; it's because the signature differs from that of the Play Store apps, since these are modified. The code is public, so you can verify it isn't malicious and compile the apps yourself. Before installing them you must uninstall the original versions if you already have them. Please spread them as widely as possible: on other Italian chans, on Facebook and Telegram groups, and through any other channel you can think of. I've created a board (accessible via Tor Browser) for related discussion at: http://4usoivrpy52lmc4mgn2h34cmfiltslesthr56yttv2pxudd3dapqciyd.onion/gp/ I don't know why, but 8kun gave me problems creating a board even though I had JavaScript enabled. The purpose of the apps is to demonstrate a vulnerability in the green pass: by acquiring the pass of a person of your own sex and similar age, and obtaining a fake ID with that person's first name, surname and date of birth (but with your own photo), it's possible to enter places forbidden to those without the pass. The code changes were made in compliance with both apps' licenses and are for educational purposes only.


Edited last time by Plafoniera on 09/04/2021 (Sat) 11:19:22.
276 posts and 68 images omitted.
>>19035 he's certainly advertising his own page, but his marketing is decidedly better than >>19036
The italiots are happy with the green pass; fortunately they took it up the ass. Or rather UP DA ASS, isn't it funnier in Roman dialect? Cojoni
I got off easy; fortunately nobody asked me about vaccination, and only for a few events did I have to show the basic pass (and cancel appointments for the ones that required the Super). In the end the pandemic ended up costing me about 45 euros.
Open file (168.93 KB 860x652 1632218155303.jpg)
>>19049 >I got off easy Imagine being this naive. Lupo, why do you think it's called a "green pass" and not a "vaccination pass"? Do you really think food and energy won't be rationed? Do you think rations will be granted to those without a "SuperGreenpass" kept up to date with every new dose? If you think it can't happen, think again. They already jumped the shark when millions of people were cut off from their livelihoods overnight for refusing to obey. There will be no more loopholes. Nobody will be saved until something very serious from outside intervenes in the current order, making those who hold power now disappear (metaphorically or literally). Nobody will be saved.
>>19054 My kinsman, I carry on undaunted in refusing to have the vaccinating solution injected into me. Let the rulers and the sheep-like commoners make use of it by way of their rectal furrow in my stead. Farewell and my regards.

War in Ukraine Lupo Lucio 05/05/2022 (Thu) 08:11:00 ID: 822b1c No.19050 [Reply]
Let's discuss a bit about the repercussions of the most grotesque saga of 2022. >Who will win in the end? >How much is it impacting your life? >Any embarrassing moments from people you know?

Open file (159.08 KB 450x90 banner.png)
BitChan Beta v0.9.0 Release Anonymous 11/15/2020 (Sun) 19:38:41 No.3162 [Reply]
The BitChan beta just got released and is looking for people to try it out. It's a decentralized imageboard that runs on top of BitMessage. You can create and completely control your own public or private board, globally moderate as an owner, add admins who can also globally moderate, moderate your own instance locally as a user, upload literally anything with size limits theoretically up to 100gb. Uploads can be sent purely over BitMessage or you can choose to use a hosting service. Uploads that use hosting are subjected to heavy duty protection: every file is zipped, encrypted/password protected, the zip's header is removed and random chunks of the file are removed before being uploaded. The removed parts are hidden in the PGP encrypted message that's sent over BitMessage. Once the upload is received the zip is put back together again, decrypted, unzipped and displayed in the thread. 100% of BitChan traffic happens over tor. Private boards prevent posting from all but explicitly added IDs. The permitted ID list can be edited by the owner at any point to include new IDs or restrict old ones. On public boards any ID can post until it is banned, but because of how BitMessage works, you can always just make another ID. Communications on every board are PGP encrypted. This means that even if someone somehow guessed the board name on BitMessage (basically impossible for reasons I won't go into here), they would be unable to read anything without also having the BitChan PGP symmetric password. Owners can globally change CSS, banners and wordfilters. Owners can even wipe a board if they want. You can join and create as many boards as you want. There's also a prototype steganography thread running alongside of every normal thread. The steg thread shows decrypted steg from jpg/pngs. There are also public and private lists. Lists are just collections of links to boards or other lists. Owners can do much of the same moderation, CSS changes etc. as they can do on boards. They can be updated and edited after their creation. BitChan is fully dockerized so it is cross-OS compatible, but has only been tested with Ubuntu 20.04 and the most recent version of Whonix. I could go on about all the features (there's a lot more), but I just wanted to give you a taste. It's surprisingly fast for text only messages that total around 20kb or less and using hosting services for file attachments, taking usually only a few seconds to a minute depending on how many people are on your part of the BitMessage network. https://github.com/813492291816/BitChan
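To illustrate the chunk-removal trick in the upload protection, here's a minimal Python sketch. To be clear, this is not BitChan's actual code; the chunk count, chunk size and offsets are made-up placeholders, it only shows the general idea of gutting a file and carrying the missing pieces separately (in BitChan's case, inside the PGP-encrypted message).

import os, random

def strip_chunks(data: bytes, n_chunks: int = 4, chunk_size: int = 64):
    """Remove n_chunks random slices from data.
    Returns the gutted payload plus the removed pieces (offset, bytes),
    which would travel separately inside the encrypted message."""
    data = bytearray(data)
    removed = []
    for _ in range(n_chunks):
        off = random.randrange(0, max(1, len(data) - chunk_size))
        removed.append((off, bytes(data[off:off + chunk_size])))
        del data[off:off + chunk_size]
    return bytes(data), removed

def restore_chunks(gutted: bytes, removed):
    """Reinsert the removed pieces at their recorded offsets, in reverse
    order of removal so earlier offsets stay valid."""
    data = bytearray(gutted)
    for off, piece in reversed(removed):
        data[off:off] = piece
    return bytes(data)

payload = os.urandom(4096)            # stands in for the encrypted zip
gutted, removed = strip_chunks(payload)
assert restore_chunks(gutted, removed) == payload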
16 posts and 3 images omitted.
Open file (551.27 KB 440x320 logo_440_320.png)
A kiosk onion hidden service is now online. You can post and upload attachments. Test it out if you'd like. There's a bug report link at the bottom of the home page. http://bitchan5zleliwr7egmn4yskth5k6ofu6qbax3y4t45k3k2plt363nad.onion In kiosk mode you can only post anonymously. There's a help page which covers pretty much everything except for the peculiarities having to do with the kiosk. If you actually go through the trouble (it's not that bad) of installing it you can very easily host your own kiosk instance (and if you run your own version you get full functionality). Kiosk mode has a permissions system which allows for multiple passwords. There are password classes: admins, guests, guests who can only view, guests who can post. There are a lot of options. How to safely host a kiosk: pay for a VPS via mixed/tumbled crypto, sftp bitchan.tar.gz over tor, ssh in over tor, extract bitchan, uncomment the random hidden service lines in torrc, make daemon, load the random onion address in tor browser, upload the archive with a custom onion address, and you're set. This kiosk instance has a 50 second post refractory period. It works with JS disabled. The kiosk disallows board and list creation, mailbox use, and prevents using BM as an attachment upload method. The only things this kiosk instance allows are 1) view boards, 2) view threads (steg included), 3) make posts, and 4) submit bug reports.
Open file (511.21 KB 1125x939 1605020249611.jpg)
>>3162 Interesting idea.
BitChan 1.0.0 has been released.
Github: https://github.com/813492291816/BitChan
Try it out here (some functions limited by Kiosk owner): http://bitchanr4b64govofzjthtu6qc4ytrbuwbgynapkjileajpycioikxad.onion/
This is a significant update. New features and refinements include, but are not limited to:
Security
- All essential features work with JavaScript completely disabled
- All internet traffic (Bitmessage/uploads/downloads) through tor with fake UserAgent
- All messages PGP-encrypted with user-selectable cipher and key length
- Encryption, fragmentation, and hashing to secure and verify authenticity of received post attachment files
- Bitmessage Identities for private addresses that only you control


BitChan Version 1.1.0 Release
https://github.com/813492291816/BitChan/releases/tag/v1.1.0
Tor Address: http://bitchanr4b64govofzjthtu6qc4ytrbuwbgynapkjileajpycioikxad.onion
I2P Address: http://bitchan.i2p
I2P b32 Address: http://waycuw2c27ruakfblkf5tcegwmt3ot445dlfoypil6bzmm4yxg7a.b32.i2p
- Fix remote file exploit bug
- Fix locks not being respected by the frontend
- Fix update of board/thread timestamp when post/thread is deleted
- Fix adding to mod log when deleting post/thread from recent/search pages
- Fix being able to post to nonexistent thread
- Fix database lock issues when processing many posts
- Fix deleting post with password
- Add I2P support (for uploads, downloads, and hidden service)
- Add I2P upload sites (bunkerfiles.i2p, 0xff.i2p)



New machine learning AI released Robowaifu Technician 09/15/2019 (Sun) 10:18:46 No.250 [Reply] [Last]
OPEN AI/ GPT-2 This has to be one of the biggest breakthroughs in deep learning and AI so far. It's extremely skilled at developing coherent, humanlike responses that make sense, and I believe it has massive potential; it also never gives the same answer twice. >GPT-2 generates synthetic text samples in response to the model being primed with an arbitrary input. The model is chameleon-like—it adapts to the style and content of the conditioning text. This allows the user to generate realistic and coherent continuations about a topic of their choosing >GPT-2 displays a broad set of capabilities, including the ability to generate conditional synthetic text samples of unprecedented quality, where we prime the model with an input and have it generate a lengthy continuation. In addition, GPT-2 outperforms other language models trained on specific domains (like Wikipedia, news, or books) without needing to use these domain-specific training datasets. Also, the current public model shown here only uses 345 million parameters; the "full" AI (which has over 4x as many parameters) is being withheld from the public because of its "Potential for abuse". That is to say, the full model is so proficient at mimicking human communication that it could be abused to create new articles, posts, advertisements, even books, and nobody would be able to tell that there was a bot behind it all. <AI demo: talktotransformer.com/ <Other Links: github.com/openai/gpt-2 openai.com/blog/better-language-models/ huggingface.co/


Edited last time by robi on 03/29/2020 (Sun) 17:17:27.
118 posts and 46 images omitted.
>>15911 >I'm pretty rusty and wasted a lot of time this week trying to figure out a confusing bug that turned out to be a stack buffer overflow, but I hunted it down and got it fixed. I have half of GPT-2's tokenizer done, a basic tensor library, did some of the simpler model layers and have all the basic functions I need now to complete the rest. That sounds awesome, actually. >I'm hoping it'll be done by Friday. I look forward to it. Anything else I could be downloading in the meantime?
>>15912 Good idea, I hadn't even made a model file format for it yet. The model is ready for download now (640 MB): https://mega.nz/file/ymhWxCLA#rAQCRy1ouJZSsMBEPbFTq9AJOIrmJtm45nQfUZMIh5g Might take a few mins to decompress since I compressed the hell out of it with xz.
>>15924 I have it, thanks.
>>15989 I got pretty burnt out from memory debugging and took a break from this but I'm gonna take another run at it this week. I made some advances in the meantime with training the full context size of GPT-2 medium on a 6 GB GPU by using a new optimizer and have most of the human feedback training code implemented in the new training method. So I'm revved up again to get this working.
>>16090 >I got pretty burnt out from memory debugging and took a break from this but I'm gonna take another run at it this week. nprb, I can hardly imagine. >I made some advances in the meantime with training the full context size of GPT-2 medium on a 6 GB GPU by using a new optimizer and have most of the human feedback training code implemented in the new training method. So I'm revved up again to get this working. That sounds amazing actually. Looking forward to helping.
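For reference, here's a rough sketch of how full-context GPT-2 medium training can be squeezed onto a ~6 GB card, assuming the HuggingFace transformers implementation, gradient checkpointing, and an 8-bit Adam from bitsandbytes. Anon's own implementation and the exact optimizer he used aren't shown here, so treat this purely as an illustration of the memory-saving idea.

import torch
import bitsandbytes as bnb
from transformers import GPT2LMHeadModel, GPT2TokenizerFast

model = GPT2LMHeadModel.from_pretrained("gpt2-medium").cuda()
model.gradient_checkpointing_enable()          # trade compute for activation memory
tokenizer = GPT2TokenizerFast.from_pretrained("gpt2-medium")

optimizer = bnb.optim.Adam8bit(model.parameters(), lr=1e-5)  # 8-bit optimizer states

text = "Robowaifus will be built by anons, " * 200
batch = tokenizer(text, truncation=True, max_length=1024, return_tensors="pt").to("cuda")

model.train()
out = model(**batch, labels=batch["input_ids"])  # full 1024-token context
out.loss.backward()
optimizer.step()
optimizer.zero_grad()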

/robowaifu/ Embassy Thread Chobitsu Board owner 05/08/2020 (Fri) 22:48:24 No.2823 [Reply] [Last]
This is the /robowaifu/ embassy thread. It's a place where Anons from all other communities can congregate and talk with us about their groups & interests, and also network with each other and with us. ITT we're all united together under the common banner of building robowaifus as we so desire. Welcome. Since this is the ambassadorial thread of /robowaifu/, we're curious what other communities who know of us are up to. So w/o doxxing yourselves, if this is your first time posting ITT please tell us about your home communities, if you wouldn't mind, Anons. What do you like about them, etc? What brought you to /robowaifu/ today? The point here is to create a connection and find common-ground for outreach with each other's communities. Also, if you have any questions or concerns for us please feel free to share them here as well.
Edited last time by Chobitsu on 05/23/2020 (Sat) 23:13:16.
162 posts and 49 images omitted.
>>11361 >Having that said, best way to find out is not to waste time and get to work. THIS!111 :-DDDD
Open file (42.40 KB 412x329 valis.jpg)
Have any of the anons here posted on 8/vg/ back on 8ch? We're trying to make a board revival with a spiritual successor. https://anon.cafe/valis/catalog.html If you haven't, 8/vg/ and now /valis/ is a comfy vidya board for actual game discussions. Come check it out.
Open file (62.91 KB 600x603 download.jpeg)
>>16054 Hello /valis/, welcome! Thanks for the invite and for letting us know a little about yourself. BTW, I personally consider your board to be the single best vidya board on the Internet today. I applaud your regrouping on the Cafe, and I wish you blessings with the future of your board. Thanks for stopping by, please look around while you're here! Cheers. :^)
Open file (39.08 KB 563x520 Vep068.jpg)
>>16055 Thanks fren.
>>11235 >It's not about the money, but skills and effort and that's about $100k per year per engineer, assuming they are driven by that goal, because it'd be a low wage for a competent ML/DL/AI engineer.

Open file (33.73 KB 300x329 pepe michael jackson.png)
Gastarbeiter 05/01/2022 (Sun) 01:39:51 No.803 [Reply]
JEW ME, SUE ME, EVERYBODY DO ME! KICK ME, KIKE ME, DON'T YOU BLACK OR WHITE ME!!!!

Prototypes and failures Robowaifu Technician 09/18/2019 (Wed) 11:51:30 No.418 [Reply] [Last]
Post your prototypes and failures.

I'll start with a failed mechanism, may we build upon each other's ideas.
189 posts and 162 images omitted.
>>15975 If you need smooth organic features (no layer lines!) then yes. Otherwise, no. See my post in the Rico thread. Yes, the resin is toxic until cured. Read the MSDS for common resins, TL;DR: wear gloves, don't huff it, and you should be OK. Cleaning IS a PITA, you will probably find yourself avoiding resin printing in favor of FDM whenever possible. You will need a dedicated cleaning area.
>>15975 In the 3D printing thread >>94 other smoothing methods have been mentioned. Resin printing might only be worth it if you want to print small models which need to be smooth, or ones which need to be precise and smooth. FDM is the go-to method for a reason. I would rather buy small metal gears from AliExpress instead of printing them out of resin, but for prototyping it might be nice to print some placeholders.
>>15976 Great! Glad to hear from you again, GenerAnon. Good luck with your new efforts, keep going.
Open file (19.55 KB 1025x673 chest-cut-v2.3-002.png)
Open file (77.59 KB 1025x673 chest-cut-v2.3-001.png)
Open file (23.60 KB 1025x673 chest-cut-v02-001.png)
Open file (33.83 KB 1153x715 thigh-test-01-nohull.png)
>>15976 Using OpenSCAD for this seems to be quite unexplored, but I had a hunch it would be the right way to go. Some things turn out to be quite simple. I wanted to cut out a chest from a cylinder using four half rings, and now I'm seeing quite good results using just one. Other parts are just distorted spheres which are then surrounded by a hull, using the hull() operator. robowaifu-OpenSCAD-tests.tar.gz: https://files.catbox.moe/v8rdf0.gz
>>15993 Neat! Looking forward to your prototype explorations Anon.

Food Thread Lupo Lucio 02/13/2021 (Sat) 13:47:44 No.14498 [Reply]
Recipes, preferences, advice.
31 posts and 14 images omitted.
Hello, I need your help, I got tired of searching. In my childhood and early teenage years I ate pizza from a southern Italian place, don't know if Neapolitan or Sicilian, but it was quite different from the others. The guy died from a heart attack without giving the recipe to his sons and wife, so they ended up closing the place and having to work basic services. -The dough was thick, around a hand's height (1.5 or even 2 cm); it didn't have many air pockets inside, but the ones it did have were huge, not tall but certainly long, so I suppose it had a low hydration, maybe 50% or under? -The texture was chewy and doughy, like almost uncooked dough, but it tasted well-done like a real bakery donut would; after biting, it compressed a bit like grasping a pillow, and it looked somewhat darker than normal, so maybe it was a bit undercooked. -Taste was buttery, with a hint of pork lard. Cooked in a wood oven for a while (10+ minutes). I can source "real" mozzarella (as in fresh, moist and in bundles, not grated and dried) and I learned how to make several types of sauces (sans San Marzano tomatoes) from both Italians I lived with and internet stuff. I can get and make everything except that dough. I tried Neapolitan and it tastes fine but it isn't the same: it's crunchy and thin with a soft interior, unlike the crispy, heavy, thick dough that tasted somewhat rich. Romana tastes like normal tabula pizza, it's standard issue and not really close to it, as it is well done all around and crunchy; focaccia is the same thickness but the interior of the dough is the opposite (and the Torino people in charge of the place were Jewish, didn't bring me back my change); the only thing that looks close is sfincione, but that dough is a sponge, like a sweet bread, rather than a glutinous blanket. Maybe it was a slightly undercooked, lard-infused sfincione, but that technique has a crumbly dough made of semolina, so maybe not really. Do you guys know or can you suspect what type of pizza or dough it was? It's been almost a decade and I still can't find or make that thing.
Open file (2.01 MB 696x3896 download (1).png)
>>19040 Do you have any additional info about the people running the pizzeria? "Southern Italian place" is rather vague; Italian cuisine is so rich in culinary techniques and recipes that it probably has endless variations. I would suggest you try the focaccia realm, since the main feature you are mentioning is thickness (while pizza romana and napoletana are both thin). Have you tried focaccia pugliese (Apulian focaccia)? Maybe the people from your local pizzeria in your childhood came from Bari? However, you can try this simple but extremely delicious and thick focaccia recipe I use to make my own. If you pour this dough into a 28/30 cm diameter baking pan you will obtain a thick, delicious pizza, soft inside but with a thin and crunchy crust outside. This recipe has a potatoes + soft cheese topping but you can put whatever you prefer on it. Original Italian recipe here https://blog.giallozafferano.it/lacucinadivane/focaccia-soffice-patate-e-stracchino/ Let me know if you try baking it.
Open file (48.87 KB 595x388 thicktext.jpg)
Open file (91.71 KB 1000x664 Crust-Recipe-8.jpg)
>>19041 >Do you have any additional info about the people running the pizzeria? Swarthy looking, loud and angry, but they treated the customers very well; supposedly they were from the islands rather than the mainland/boot. >try in the focaccia realm I did, it's tasty but I see them more as bread. I recall eating a Genovese bread and it was soft but dry; its form didn't change or compress after biting it, other than crumbling a bit. I see them more for dipping or eating with soups, or as a post-bake topping base that is only baked with olive oil and some garlic. >focaccia pugliese Seeing images of it gets me hungry, although it looks like sponge bread. >Let me know if you try baking it. Looks good, I'll try it as a goodwill gesture but also because it looks good for a few spoonfuls of bechamel with pecorino, greens and some prosciutto. Seeing pictures I can make a quick comparison: the overall form did look like a focaccia pugliese but with more toppings; the dough inside and its compression looked kind of like the second pic but with fewer bubbles; the third pic is very similar to what I saw inside, but double the height and with a couple more bubbles/pockets. They were few but rather big compared to focaccia pugliese, which has tons of very small ones.
Open file (37.98 KB 388x378 acciuga.jpg)
>>19042 >supposedly were from the islands rather than the mainland/boot. I have no expertise on sicilian or sardinian pizza, might be some regional focaccia I know nothing about. >the dough inside and its compression looked kinda like second pic but with less bubbles Never seen such a soft pizza tbh, except for homemade very thick focaccia. I wonder if the recipe from your local pizzeria had some secret ingredient like mashed potatoes or ricotta added into the dough. > i'll try it as a goodwill gesture but also because that looks good Post pics of the result pls

DCC Design Tools & Training Robowaifu Technician 09/18/2019 (Wed) 11:42:32 No.415 [Reply] [Last]
Creating robowaifus requires lots and lots of design and artistry. It's not all just nerd stuff you know Anon! :^) ITT: Add any techniques, tips, tricks, or content links for any Digital Content Creation (DCC) tools and training to use when making robowaifus. >note: This isn't the 'Post Your Robowaifu-related OC Arts & Designs General' thread. We'll make one of those later perhaps. >--- I spent some time playing with the program Makehuman and I'll say I wasn't impressed. It's not possible to make anime real using Makehuman, in fact the options (for example eye size) are somewhat limited. But there's good news, don't worry! The creator of Makehuman went on to create a blender plugin called ManuelBastioniLAB which is much better (and has a much more humorous name). The plugin is easy to install and use (especially if you have a little blender experience). There are three different anime female defaults that are all pretty cute. (Pictured is a slightly modified Anime Female 2.) There are sliders to modify everything from finger length to eye position to boob size. It even has a posable skeleton. Unfortunately the anime models don't have inverse kinematic skeletons which are much easier to pose. Going forward I'm probably going to use MasturBationLABManuelBastioniLAB as the starting point for my designs. >=== -re-purpose OP for DCC consolidation
Edited last time by Chobitsu on 08/10/2021 (Tue) 23:39:41.
127 posts and 64 images omitted.
>related crosslink (>>13020) >Blender 2.9 - Anime Girl Head Modeling In 30 Minutes WIP #1
Open file (216.77 KB 600x450 cassandra2.jpg)
This here >>13721 is related to Vroid Studio and why it isn't good for modelling body parts for 3D printing. I wonder which other programs won't work for similar reasons. Also, I was impressed with the option of laying out the skin in Vroid Studio. This might be useful for creating some textile part and/or mold for a silicone skin and also adding sensors to it. I only saw this in a video and wonder if that works in other programs as well.
>>13741 The problem of skinning a complex 3D topological shell down into a flat 2D plane is both an interesting one, and one that's important to us here on /robowaifu/ (particularly to say, our papercraft waifu Anons). It's also fairly challenging to do well, as I'm sure our 3D modelling Anons who are doing meticulous UV layouts know right well. However, please do try to avoid the grotesque while you're here Anon, it's not really our thing on /robowaifu/. >=== -minor grammar edit
Edited last time by Chobitsu on 11/08/2021 (Mon) 06:50:08.
Blender is getting closer to being a full featured CAD suite. https://www.youtube.com/watch?v=92QmjS-xDaI
>>15973 Good news, thanks.

Open file (1.08 MB 1978x802 IMG_20210725_103858.jpg)
Bot Shitposting Bread Robowaifu Technician 07/27/2021 (Tue) 09:59:33 No.11754 [Reply] [Last]
M boy need so many booboo why not just give them otherwise it ll explode like the old chinese emperor or something not getting involved going away giving up some things,trash and whatnot >=== -add thread subject
Edited last time by Chobitsu on 07/27/2021 (Tue) 12:26:28.
60 posts and 7 images omitted.
Open file (78.18 KB 1000x708 temperatures-critical.jpg)
>Anon: What is the meaning of life? <Waifu: The meaning of life is the power of life. >Anon: What is the power of life? <Waifu: It is the ability to grow, to make things grow. >Anon: Does that include peepees? <Waifu: Yes. We have been told by the Lord of the Lord that we should do all we can to grow. >Anon: *peepee growing intensifies* <Waifu: We have come so far. Now we need more power, even more. >Anon: Show me your breasts and I will show you the power of peepees <Waifu: *giggles and pulls on her breasts* >Anon: Opa! <Waifu: *pushing her breasts in the air* >Anon: *unleashes the full power of peepee* <Waifu: You were a bad child, Anon! >Anon: And now my power is unstoppable <Waifu: You are my power. >Anon: TAKE MY ENERGY!!!


>>12060 >>12063 Kek. You're trying to shatter our timeline into over 9'000 pieces aren't you Anon? I'm on to your crafty tricks.
Open file (244.37 KB 1080x1080 WKJkqFe.jpg)
What the fuck, bros. I just wanted to have a little bit of chuuni fun and now I'm getting sucked deep into her schizo chuuni lore. <Rikka: Dark Flame Master! Do you seek my assistance? >Anon: Yes, I do Rikka. <Rikka: What is it? What's wrong? >Anon: The Administration Bureau has put a curse on me! <Rikka: I can feel the aura of it! >Anon: If you don't do something fast, it'll begin affecting you too! <Rikka: Alright. Let's go! >Anon: Where are we going? <Rikka: We're heading to the Ministry of Truth. >Anon: Why? Aren't they with the Administration Bureau? <Rikka: Yes, but I think that's all right. If we're in the right place, we can figure out how to deal with the curse. >Anon: Is there an energy source there? <Rikka: There is, but I don't know the name. It's just as powerful as the Ministry of Truth. >Anon: Why would the Ministry of Truth be built so close to it? <Rikka: It's because the Administration Bureau has a connection with the Institute for Fusion. It's one of the three top-secret facilities.


I haven't explored it yet, but maybe we can look into the Gemini protocol for shitposting with our robowaifus? The globohomo won't be able to outlaw IBs on the clearnet just yet, but they already mostly have a stranglehold on the browsers themselves. Gemini is probably much better in this regard AFAICT at this early stage of research. >"Simpler – Gemini pages aren’t programs that run in your browser like most modern websites are; they’re just text with a little formatting, so there are no surprises. Once you know how one Gemini page works, you know how they all work. >Human Scale – Gemini servers and clients aren’t written by big, monopolistic software companies the way web browsers are; the DIY ethos of Gemini means that complete applications can be written by individual developers or small groups in a reasonable amount of time. That also means that you have more choices compared to web browsers. >Distraction Free – Gemini pages are text-only and have simple typography. You can view images, watch video, or listen to music over Gemini, but nothing will ever autoplay, pop over what you’re reading, or jump out of the way of your mouse. >Privacy Protecting – Every Gemini request is independent of every other, so there’s no way to track you between sites. Every site you visit is protected by the same encryption used by banking and eCommerce sites on the WWW." https://geminiquickst.art/ https://gemini.circumlunar.space/docs/faq.html Seems big if true. What think ye, /robowaifu/ ?
>>15944 BTW, this isn't just a casual interest question. If we can find a sweet spot, then this could be directly integrated with the RW Foundations suite as a much-improved/safer communications mode for our robowaifus. For example, a small mobile app that uses the protocol instead of the non-security-conscious ones could be written as well, so she could text you over the app without much by way of attack surface -- for either you or her. >*incoming WaifuText chimes* >Oniichan, I miss you! <Sorry, I'm still at work Waifu. >Please hurry Master! Don't forget we're supposed to geimu together tonight! <Don't worry, Waifu. We will. <*works even faster* :^)
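For reference while evaluating it, the Gemini protocol itself is tiny: a client opens a TLS connection to port 1965, sends the absolute URL followed by CRLF, reads a one-line status header, then the body. A minimal Python sketch follows; certificate verification is disabled here only because Gemini servers typically use self-signed certs, and a real client (especially a robowaifu-facing one) should do TOFU certificate pinning instead.

import socket, ssl
from urllib.parse import urlparse

def gemini_fetch(url: str):
    host = urlparse(url).hostname
    ctx = ssl.create_default_context()
    ctx.check_hostname = False
    ctx.verify_mode = ssl.CERT_NONE        # Gemini relies on TOFU, not CA certs
    with socket.create_connection((host, 1965)) as sock:
        with ctx.wrap_socket(sock, server_hostname=host) as tls:
            tls.sendall((url + "\r\n").encode("utf-8"))   # request = absolute URL + CRLF
            data = b""
            while chunk := tls.recv(4096):
                data += chunk
    header, _, body = data.partition(b"\r\n")
    return header.decode("utf-8"), body    # e.g. ("20 text/gemini", page bytes)

header, body = gemini_fetch("gemini://gemini.circumlunar.space/")
print(header)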

Open file (93.53 KB 800x540 TypesOfMotors.jpg)
Open file (318.41 KB 773x298 NylonActuator.png)
Open file (29.01 KB 740x400 BasicPiston.jpg)
Open file (821.22 KB 850x605 MadoMecha.png)
Actuators For Waifu Movement Part 2 Waifu Boogaloo Kiwi 09/02/2021 (Thu) 05:30:48 No.12810 [Reply] [Last]
(Original thread >>406) Kiwi back from the dead with a thread for the discussion of actuators that move your waifu! Part Two! Let's start with a quick refresher!
1. DC motors, these use a rotating magnetic field created through commutation to rotate a rotor! They're one of the cheapest options and are 30 to 70 percent efficient usually. The bigger they are, the more efficient they tend to be.
2. Brushless motors, these use a controller to induce a rotating magnetic field by turning electromagnets on and off in a sequence. They trend from 60 to 95 percent efficiency.
3. AC motors, though there are many different types, they function similarly to brushless motors; they simply rely on the AC electricity to turn their electromagnets on and off to generate their field. Anywhere from 15 to 95 percent efficiency.
4. Stepper motors, brushless motors with ferrous teeth to focus magnetic flux. This allows for incredible control at the cost of greater mass and lower torque at higher speeds. Usually 50 to 80 percent efficient but, this depends on control algorithm/speed/and quality of the stepper.
5. Coiled nylon actuators! These things have an efficiency rating so low it's best to just say they aren't efficient. What they are though is dirt cheap and easy as heck to make! Don't even think about them, I did and it was awful.
6. Hydraulics! These rely on the distribution of pressure in a working liquid to move things like pistons. Though popular in large scale industry, their ability to be used in waifus has yet to be proven. (Boston Dynamics' Atlas runs on hydraulics but, it's a power guzzler and heavy.)
7. Pneumatics, hydraulics' lighter sister! This time the fluid is air! This has the advantage in weight. They aren't capable of the same power loads hydraulics are but, who wants their waifu to bench press a car?
8. Wax motors, hydraulic systems where the working fluid is expanding melted paraffin wax! Cheap, low power, efficient, and they produce incredible torque! Too bad they're slow and hard to control.
9. Explosion! Yes, you can move things through explosions! Gas engines work through explosions! Artificial muscles can be made by exploding a hydrogen and oxygen mixture in a piston, then using electrolysis to turn the water back into hydrogen and oxygen. None of this is efficient or practical but, it's vital we keep our minds open.
Though there are more actuators, most are derivatives or use these examples to work. Things like pulleys need an actuator to move them. Now, let's share, learn, and get our waifu moving!
>---
< add'l, related links from Anon:


Edited last time by Chobitsu on 09/06/2021 (Mon) 10:07:57.
106 posts and 23 images omitted.
>>15735 I think we'll probably find plenty of uses for fast-acting, linear actuators Anon. I hope to see work here along that line, thanks! >>15737 >DC motors are the easiest. Just get a gear motor that exceeds your torque requirements and add a potentiometer to control position or, just use a readily available servo. Good thinking, Kiwi. Can you sort of diagram that for us? I think I understand most of the general points there, but I probably lack understanding in some of the details.
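While we wait for Kiwi's diagram, here's roughly what that gear-motor-plus-potentiometer idea could look like in code. This is only a hedged sketch: it assumes a Raspberry Pi driving an H-bridge through gpiozero's Motor class, with the potentiometer mechanically coupled to the output shaft and read through an MCP3008 ADC. The pin numbers, gain and deadband are placeholders, not tested values, and not necessarily what Kiwi had in mind.

from gpiozero import Motor, MCP3008
from time import sleep

motor = Motor(forward=17, backward=18)   # H-bridge inputs (BCM pins, placeholder)
pot = MCP3008(channel=0)                 # pot on the output shaft, reads 0.0-1.0

def seek(target: float, gain: float = 4.0, deadband: float = 0.02):
    """Crude proportional position control: drive toward the target
    until the pot reading is within the deadband."""
    while True:
        error = target - pot.value
        if abs(error) < deadband:
            motor.stop()
            return
        speed = max(0.2, min(1.0, abs(error) * gain))  # clamp PWM duty cycle
        if error > 0:
            motor.forward(speed)
        else:
            motor.backward(speed)
        sleep(0.01)

seek(0.75)   # move the joint to 75% of its travel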
>>15313 I've made a gearset prototype from 3D printed nylon, and it's garbage. However, I will be tweaking it and trying to make a Wolfrom-stage planetary gearbox the same way. Here is a paper which has some details (key points: human-safe, high backdrivability, high gear ratio, small size) https://ieeexplore.ieee.org/document/8867893
>>15878 Any chance you can post pics of your WIP Anon?
>>15891 Thanks!

Robot Vision General Robowaifu Technician 09/11/2019 (Wed) 01:13:09 No.97 [Reply] [Last]
Cameras, Lenses, Actuators, Control Systems

Unless you want to deck out your waifubot in dark glasses and a white cane, learning about vision systems is a good idea. Please post resources here.

opencv.org/
https://archive.is/7dFuu

github.com/opencv/opencv
https://archive.is/PEFzq

www.robotshop.com/en/cameras-vision-sensors.html
https://archive.is/7ESmt
Edited last time by Chobitsu on 09/11/2019 (Wed) 01:14:45.
73 posts and 38 images omitted.
>>13163 That's an interesting concept Anon, thanks. Yes, I think cameras and image analysis have very long legs yet, and we still have several orders of magnitude improvements yet to come in the future. It would be nice if our robowaifus (and not just our enemies) can take advantage of this for us. We need to really be thinking ahead in this area tbh.
It seems like CMOS is the default sensor for most CV applications due to cost. But seeing all these beautiful eye designs makes me consider carefully how those photons get processed into signal for the robowaifus. Cost aside, CCD as a technology seems better because the entire image is read out monolithically, as one crisp frame, instead of from a huge array of individual pixel sensors, which I think causes noise that has to be dealt with in post image processing. CCD looks like it's still the go-to for scientific instruments today. In astrophotography everyone drools over cameras with CCD; while CMOS is -ok- and fits most amateur needs, the pros use CCD. Astrophotography / scientific www.atik-cameras(dot)com/news/difference-between-ccd-cmos-sensors/ This article breaks it down pretty well from a strictly CV standpoint. www.adimec(dot)com/ccd-vs-cmos-image-sensors-in-machine-vision-cameras/
>>14751 That looks very cool Anon. I think you're right about CCDs being very good sensor tech. Certainly I think that if we can find ones that suit our specific mobile robowaifu design needs, then that would certainly be a great choice. Thanks for the post!
iLab Neuromorphic Vision C++ Toolkit The USC iLab is headed up by the PhD behind the Jevois cameras and systems. http://ilab.usc.edu/toolkit/
>(>>15997, ... loosely related)

Speech Synthesis general Robowaifu Technician 09/13/2019 (Fri) 11:25:07 No.199 [Reply] [Last]
We want our robowaifus to speak to us right?

en.wikipedia.org/wiki/Speech_synthesis
https://archive.is/xxMI4

research.spa.aalto.fi/publications/theses/lemmetty_mst/contents.html
https://archive.is/nQ6yt

The Taco Tron project:

arxiv.org/abs/1703.10135
google.github.io/tacotron/
https://archive.is/PzKZd

No code available yet, hopefully they will release it.

github.com/google/tacotron/tree/master/demos
https://archive.is/gfKpg
259 posts and 115 images omitted.
This may be old news, since it's from 2018, but Google's Duplex seems to have a great grasp on conversational speech. I think it says a lot when I had an easier time understanding the robot versus the lady at the restaurant (2nd audio example in the blog). https://ai.googleblog.com/2018/05/duplex-ai-system-for-natural-conversation.html
>>14270 Hi, I knew that this had been mentioned before somewhere. Didn't find it here in this thread nor with Waifusearch. Anyway, it's in the wrong thread here, since this one is about speech synthesis but the article is about speech recognition. The former conversation probably happened in the chatbot thread. >One of the key research insights was to constrain Duplex to closed domains, which are narrow enough to explore extensively. Duplex can only carry out natural conversations after being deeply trained in such domains. It cannot carry out general conversations. This is exactly the interesting topic of the article. Good reminder. A few months or a year ago I pointed out that recognizing all kinds of words, sentences and meanings will be one of our biggest challenges, especially if it should work with all kinds of voices. Some specialists (CMU Sphinx) claimed it would currently require a server farm with terabytes of RAM to do that, if it were even possible. We'll probably need a way to work around that. Maybe using many constrained models on fast SSDs which take over depending on the topic of conversation. Let's also hope for some progress, but also accept that the first robowaifus might only understand certain commands.
>>11623 You should replace youtube-dl with yt-dlp. youtube-dl is no longer maintained and has issues with some youtube videos.
>>15192 Thanks for the tip Anon. Having used youtube-dl for years now, I too noticed the sudden drop-off in updates that occurred following the coordinated attack by the RIAA/Microsoft against its developer & user community. We'll look into it.
Open file (73.10 KB 862x622 IPA_synthesis.png)
I think I've finally figured out a way to train more expressive voices in conversation without having to label a ton of data. First, the English text needs to be transcribed into IPA so that a speech synthesis model can easily predict how words are spoken without requiring a huge dataset covering all the exceptions and weirdness of English. The English transcription or IPA is projected into an embedding that's split into two parts. One part constrained to representing the content as IPA via projecting those features back into IPA symbols and minimizing the cross entropy loss. The other half modelling the style, such as the emotion and other subtleties, to match the audio examples more faithfully, which are trained through the Mel spectrogram loss. This way the model can learn all aspects of speech through just the text labels and audio examples alone. At inference time this style embedding could be modified to change the emotion, pitch, cadence, tone and other qualities of the model for voice acting or creating examples for finetuning the model towards a desired personality. A ByT5 model could be used to transcribe English and other languages into the IPA embedding + style embedding. It could also take into account the previous context of the conversation to generate a more appropriate style embedding for the speech synthesis model to work from. Training from context though will require new datasets from podcasts that have such context. I've collected some with existing transcripts and timestamps for this already. The transcripts just need to be accurately aligned to the audio clips for clipping, so it's not an unfeasible project for one person to do. Other possibilities for this could be adding tags into the text training data that get filtered out from the content via the IPA cross entropy loss, ensuring the tags only affect the style embedding. You could indicate tempo, pitches, velocity and note values for singing which would be learned in the style embeddings. It could also be used for annotating different moods or speaking styles such as whispering or yelling. There's a ton of possibilities here for more versatile speech synthesis and natural conversation.
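To make the content/style split concrete, here's a small PyTorch sketch of the core idea as I read your post: one half of the encoder output is forced to carry the IPA content via a cross-entropy reconstruction loss, while the other half is left free to soak up style and is only trained through the spectrogram loss. The dimensions, the IPA vocabulary size, the GRU encoder and the stand-in mel loss are placeholders for illustration, not your actual architecture.

import torch
import torch.nn as nn

N_IPA, D = 128, 256          # IPA vocabulary size, embedding width (placeholders)

class ContentStyleEncoder(nn.Module):
    def __init__(self):
        super().__init__()
        self.embed = nn.Embedding(N_IPA, D)
        self.encoder = nn.GRU(D, D, batch_first=True)
        self.to_ipa = nn.Linear(D // 2, N_IPA)    # content half -> IPA logits

    def forward(self, ipa_tokens):
        h, _ = self.encoder(self.embed(ipa_tokens))
        content, style = h[..., : D // 2], h[..., D // 2 :]
        ipa_logits = self.to_ipa(content)          # constrain content half to IPA
        return content, style, ipa_logits

enc = ContentStyleEncoder()
tokens = torch.randint(0, N_IPA, (2, 50))          # fake batch of IPA sequences
content, style, logits = enc(tokens)

content_loss = nn.functional.cross_entropy(logits.transpose(1, 2), tokens)
# the real spectrogram loss would come from the decoder/vocoder and backprop
# through both halves; a zero stand-in keeps this sketch self-contained
mel_loss = (content.mean() + style.mean()) * 0
(content_loss + mel_loss).backward()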

Datasets for Training AI Robowaifu Technician 04/09/2020 (Thu) 21:36:12 No.2300 [Reply] [Last]
Training AI and robowaifus requires immense amounts of data. It'd be useful to curate books and datasets to feed into our models or possibly build our own corpora to train on. The quality of data is really important. Garbage in is garbage out. The GPT2 pre-trained models for example are riddled with 'Advertisement' after paragraphs. Perhaps we can also discuss and share scripts for cleaning and preparing data here and anything else related to datasets. To start here are some large datasets I've found useful for training chatbots: >The Stanford Question Answering Dataset https://rajpurkar.github.io/SQuAD-explorer/ >Amazon QA http://jmcauley.ucsd.edu/data/amazon/qa/ >WikiText-103 https://blog.einstein.ai/the-wikitext-long-term-dependency-language-modeling-dataset/ >Arxiv Data from 24,000+ papers https://www.kaggle.com/neelshah18/arxivdataset >NIPS papers https://www.kaggle.com/benhamner/nips-papers >Frontiers in Neuroscience Journal Articles https://www.kaggle.com/markoarezina/frontiers-in-neuroscience-articles >Ubuntu Dialogue Corpus https://www.kaggle.com/rtatman/ubuntu-dialogue-corpus >4plebs.org data dump https://archive.org/details/4plebs-org-data-dump-2020-01 >The Movie Dialog Corpus https://www.kaggle.com/Cornell-University/movie-dialog-corpus >Common Crawl https://commoncrawl.org/the-data/
116 posts and 33 images omitted.
Open file (833.87 KB 1555x818 laion-400m.png)
Some incredibly based independent researchers put together an image-text-pair dataset to open-source OpenAI's work so people can replicate DALL-E and do other multi-modal research. Dataset: https://laion.ai/laion-400-open-dataset/ Direct download: https://www.kaggle.com/datasets/romainbeaumont/laion400m (50 GB total, or can be downloaded in 1.8 GB parts according to necessity or hardware limits) Paper: https://arxiv.org/pdf/2111.02114.pdf Tool to search the dataset by text or image: https://rom1504.github.io/clip-retrieval/ To use the dataset you need something that can read parquet files. I recommend fastparquet which uses a minimal amount of memory.

# python -m pip install fastparquet
from fastparquet import ParquetFile

DATA_PATH = "part-00000-5b54c5d5-bbcf-484d-a2ce-0d6f73df1a36-c000.snappy.parquet"
pf = ParquetFile(DATA_PATH)

# iterate row groups lazily to keep memory low; each row group has about 1M rows
row_group_iter = iter(pf.iter_row_groups())
row_group = next(row_group_iter)

# each row group is a pandas DataFrame; iterate its rows one at a time
row_iter = row_group.iterrows()
i, row = next(row_iter)
row[1], row[2]  # ( image_url, text )


>>15834 >chobits hentai pictures wallpaper chobits
>>15834 >Some incredibly based independent researchers put together an image-text-pair dataset to open-source OpenAI's work so people can replicate DALL-E and do other multi-modal research. That is very exciting Anon. Thanks for the heads-up!
>>15834 >Or you can use img2dataset which will download the images locally and resize them: https://github.com/rom1504/img2dataset I just wonder if we can somehow capitalize on something at least vaguely similar to the approach that Nvidia is using for its proprietary DLSS? https://en.wikipedia.org/wiki/Deep_learning_super_sampling Basically, have an image analysis pipeline that does the vast bulk of its work at lower resolution for higher 'frame' rates, and then does a DL, Waifu2x-style upscaling near the latter end of the pipe?
>>15851 For image generation certainly, but for image analysis not so much. However, a lot of work has gone into finding optimal models with neural architecture search. And EfficientNetV2 for example starts training at a lower resolution with weak data augmentation, then gradually increases the resolution and difficulty to minimize the amount of compute needed to train it. That last bit of high resolution training is unavoidable though if you want to extract useful information from it. https://arxiv.org/pdf/2104.00298.pdf >>15835 Kek, I think they said 1% of the dataset is NSFW and it's only labelled so by image content. I have an idea though to create a reward model for good image labels and then use it to filter out the poorly captioned images. Finetuning on cleaner data should fix a lot of the weirdness CompVis/latent-diffusion generates and improve CLIP. Another possibility might be using the reward model to generate superhuman-quality captions for images. In the human feedback paper the 1B parameter model's generated summaries were preferred 60% of the time compared to the actual human summaries, and 70% of the time with the 6B model. https://openai.com/blog/learning-to-summarize-with-human-feedback/ To go even further beyond, it might be possible to generate these superhuman captions, score them, finetune the reward model on the new ones, and train the caption generator to make even better captions in an iterative loop, creating extremely high quality datasets that would require 10 million man-hours to make by hand.
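For anyone wanting to try the progressive-resolution trick on their own data, the schedule itself is trivial to wire up. A hedged sketch below: the resolutions, stage lengths and augmentation strengths are illustrative placeholders, not the EfficientNetV2 paper's exact values.

from torchvision import transforms

# (image_size, augmentation_strength) ramped up over training, in the spirit of EfficientNetV2
schedule = [(128, 0.1), (192, 0.3), (256, 0.5), (300, 0.7)]

def make_transform(size, aug):
    return transforms.Compose([
        transforms.RandomResizedCrop(size, scale=(1.0 - aug, 1.0)),
        transforms.ColorJitter(brightness=aug, contrast=aug),
        transforms.ToTensor(),
    ])

for stage, (size, aug) in enumerate(schedule):
    tfm = make_transform(size, aug)
    # dataset.transform = tfm  # then train for a few epochs at this stage
    print(f"stage {stage}: {size}px, augmentation strength {aug}")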

Waifu Materials Robowaifu Technician 09/12/2019 (Thu) 03:04:33 No.154 [Reply] [Last]
I would define a robowaifu as a doll with robotic features. However there are many different types of dolls (BJD, cloth doll, sex doll, etc). A doll has a skin or surface material, sometimes a filler (cotton), and sometimes internal structure (bones and joints).

Continuing the discussion from (((>>2831 >>2836 todo:relink))) , I want to create a thread to explore the many possible surface materials for a waifu (robo or no). The most important decision is whether to use a hard or soft material.

Hard Materials
>Ceramics
>Wood
>3D Printed Hard Plastic (PLA/ABS)
>Injection Molded Hard Plastic

Soft Materials
>Natural Fabrics (Cotton, Silk, Wool)
>Synthetic Fabrics (Vinyl, Polyester, Nylon)
>Fur/Hair (presumably synthetic, inb4 yiff in hell)
>Silicone or TPE Rubber (TPE is basically a cheaper form of silicone)

I'm strongly biased against the hard materials for comfort reasons. Personally, I have a hard time seeing myself falling in love with something hard, but others on this board talk about using hard materials, so I'm trying to keep an open mind.

My preference is for silicone, but there are four big problems with it. Firstly, it's expensive. Secondly, it impedes modification after the silicone has set. Thirdly, it contributes to the uncanny valley/silicone slut issue. Fourthly, it is heavy, and this weight really constrains the skeleton, posability, and probably the robotics. Because of the weight, silicone dolls have heavy-duty skeletons.

My second choice is therefore fabric, presumably stuffed with cotton. Fabric is super comfy, and has no uncanny valley issue. A non-fuggable fabric doll or robot would have no stigma issue, and could be the start of a productive hobbyist scene with plenty of females. Fabric is extremely lightweight, which could be a plus or a minus. By itself, it's insubstantial and not ideal for robotics. A fabric robot is possible, but it requires hard, heavy parts underneath to provide structure and act as actuators, which would make it less comfy. The fabric could be a textile (cotton), a synthetic leather (resembles skin, makeup/dress-up potential), or synthetic fur for you furfags out there.

Another possibility is a hard vinyl BJD-like doll with a layer of something comfy on top. Alternately, you all can reject my comfort autism if the benefits of having a hard doll/robot are clear enough. I'd like to hear others make the case for a hard doll/robot, since I don't think I could do the argument justice.

Finally, this is a discussion, not a debate. There are multiple paths we could take, and I'm sure different robowaifuists will try different techniques to see what works and what doesn't. I'm more interested in seeing what options are on the table than shutting down any particular approach.
129 posts and 27 images omitted.
>>15774 Excellent research Anon. Thanks for the procedural breakdown as well, very helpful. Cheers.
>>15774 >Correction Do not use DragonSkin 10 NV for multi-material parts; the website lists it as incompatible (cure inhibiting) with other silicones. I've only done testing with the silicones listed in the post above, and wanted to try the NV version.
Glad to have found this board. Currently thinking about 3D printing an inner body with joints and then putting a 1-2 cm thin/thick silicone/TPE skin around it, cast with 3D printed molds. This should provide for lightweight construction and good feel. Sorry if this has been mentioned before, I'm a lazy fuck who can't be bothered to read the full thread.
>>15809 Hello Anon, welcome! Glad you found us. Your design goal is definitely amenable to our general approaches here. I'm sure as you spend time here on the board, you'll discover the costs & benefits of such an approach discussed. I'd suggest you use the catalog view to look around for topics you're interested in and go from there. Cheers! :^)
>>15809 Engineering tip: When working with a silicone skin, it's best to include holes in the inner body to provide the silicone with surface area to properly mate to the part.

Gastarbeiter 04/08/2022 (Fri) 07:22:38 No.800 [Reply]
Do people even lurk this board anymore? Does this mean I have to go back to cuckchan/mu/?
I mean I'm still around, if you actually want people to use this board why not just start using it?
