/robowaifu/ - DIY Robot Wives

Advancing robotics to a point where anime catgrill meidos in tiny miniskirts are a reality.

Happy New Year!

The recovered files have been restored.


“What counts is not necessarily the size of the dog in the fight – it’s the size of the fight in the dog.” -t. General Dwight Eisenhower


Open file (13.41 KB 750x600 Lurking.png)
Lurk Less: Tasks to Tackle Robowaifu Technician 02/13/2023 (Mon) 05:40:18 No.20037 [Reply]
Here we share ideas on how to help the development of robowaifus. You can look for tasks to improve the board, or ones which would help to move development forward. You can also come up with a task that needs to be worked on and ask for help: use the pattern below, replace the parts in <brackets> with your own text, and post it.
>Pattern to copy and adjust for adding a task to the thread:
Task: <Description, general or very specific, and the target thread for the results>
Tips: <Link additional information and add tips on how to achieve it.>
Constraints and preferences: <Things to avoid>
Results: Post your results in the prototypes thread if you designed something >>18800, or into an on-topic thread from the catalog if you found something or created a summary or diagram.
General Disclaimer: Don't discuss your work on tasks in this thread. Make a posting in another thread, or several of them, and then another one here linking to it. We do have a thread for prototypes >>18800, current meta >>18173 and many others in the catalog https://alogs.space/robowaifu/catalog.html - the thread for posting the result might also be the best place to discuss things.
>General suggestions where you might be able to help:
- Go through threads in the catalog here https://alogs.space/robowaifu/catalog.html and make summaries and diagrams as pointed out starting here >>10428
- Work on parts instead of trying to develop and build a whole robowaifu
- Work on processes you find in some thread in the catalog https://alogs.space/robowaifu/catalog.html
- Test existing mechanisms shared on this board, prototypes >>18800
- Try to work on sensors in some kind of rubber skin and in parts >>95 >>242 >>419
- Keep track of other sites and similar projects, for example on YouTube, Twitter, or Hackaday.
- Copy useful pieces of information from threads on other sites and boards talking about "sexbots", "chatbots", AI or something similar. Pick the right thread here: https://alogs.space/robowaifu/catalog.html


Edited last time by Chobitsu on 05/08/2023 (Mon) 11:17:16.
23 posts and 6 images omitted.
>>33524 Lolno. I WILL NOT EAT ZE BUGS! <---> Somebody need to do a mashup with BB=The evil, bald, jewish Globohomo guy. :D

Welcome to /robowaifu/ Anonymous 09/09/2019 (Mon) 00:33:54 No.3 [Reply]
Why Robowaifu? Most of the world's modern women have failed their men and their societies, feminism is rampant, and men around the world have been looking for a solution. History shows there are cultural and political solutions to this problem, but we believe that technology is the best way forward at present - specifically the technology of robotics. We are technologists, dreamers, hobbyists, geeks and robots looking forward to a day when any man can build the ideal companion he desires in his own home. However, we are not content to wait for the future: we are bringing that day forward. We are creating an active hobbyist scene of builders, programmers, artists, designers, and writers using the technology of today, not tomorrow. Join us!
NOTES & FRIENDS
>Notes:
-This is generally a SFW board, given our primary engineering focus. On-topic NSFW content is OK, but please spoiler it.
-Our bunker is located at: https://trashchan.xyz/robowaifu/catalog.html Please make note of it.
-The Webring's general mustering point is located at: https://junkuchan.org/shelter/index.html
-Library thread (good for locating terms/topics) (>>7143)
>Friends:
-/clang/ - currently at https://8kun.top/clang/ - toaster-love NSFW. Metal clanging noises in the night.
-/monster/ - currently at https://smuglo.li/monster/ - bizarre NSFW. Respect the robot.
-/tech/ - currently at >>>/tech/ - installing Gentoo Anon? They'll fix you up.


Edited last time by Chobitsu on 01/30/2025 (Thu) 03:06:36.

Open file (46.39 KB 458x620 eve preview.jpg)
My Advanced Realistic Humanoid Robot Project - Eve Artbyrobot 04/18/2024 (Thu) 17:44:09 No.30954 [Reply] [Last]
So far I have plans to build Adam, Eve, and Abel robots. All of these are Bible characters. This thread will cover the Eve robot.

Eve will have no "love holes" because adding those would be sinful and evil. It is a robot, not a biological woman, after all, and I will view her with all purity of heart and mind instead of using her to fulfill the lusts of my body. Instead I will walk by the Spirit, no longer fulfilling the lusts of the flesh, as the Bible commands. Eve will be beautiful because making her beautiful is not a sinful thing to do. However, I will dress her modestly as God commands of all women everywhere. This would obviously include robot women, because otherwise the robot woman would be a stumbling block to men which could cause them to lust after her, which would be a sin. To tempt someone to sin is not loving and is evil, and so my robot will not do this. To dress her in a miniskirt, for example, would be sinful and evil, and all people who engage in sinfulness knowingly are presently on their way to hell. I don't wish this for anyone. My robot will dress in a way that is a good example to all women, with the goal of not causing anybody to lust.

My robot will have a human bone structure. It will use either a PVC medical skeleton or fiberglass fabricated hollow bones. My robot will look realistic and move realistically. It will be able to talk, walk, run, do chores, play sports, dance, rock climb, and do gymnastics. It will also be able to build more robots just like itself and manufacture other products and inventions. I realized that with just a head and arm, a robot can build the rest of its own body, so that is my intention.

My robot will use BLDC motors for drones, RC, and scooters that are high speed and low-ish torque, but I will downgear those motors with an Archimedes pulley system that will be custom made from custom-fabricated, bearing-based pulleys. By downgearing with pulleys instead of gears, I will cut down the noise the robot makes so it will be as silent as possible for indoor use. By downgearing, I convert the high speed motors into moderate speeds with great torque. BLDC motors with large torque generally are too large in diameter for a human form factor and take up too much volumetric area to be useful, which is why I go with the high speed, smaller diameter type motors but heavily downgear them 32:1 and 64:1.

My robot will have realistic silicone skin. Thom Floutz (LA-based painter, sculptor, make-up artist) is my inspiration as it pertains to realistic skin. The skin for my robots has to be at his level to be acceptable. It must be nearly impossible to tell the robot is not human to be acceptable. I will have a wireframe mesh exoskeleton that simulates the volumes and movements of muscle underneath the skin, which will give the skin its volumetric form like muscles do. Within these hollow wireframe mesh frameworks will be all the electronics and their cooling systems.

All of my motor controllers will be custom made since I need them VERY small to fit into the confined spaces I have to work with. I need LOADS of motors to replace every pertinent muscle of the human body in such a way that the robot can move in all the ways humans move and have near human levels of strength and speed.
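As a quick sanity check on the downgearing arithmetic above: an ideal N:1 reduction divides output speed by N and multiplies torque by N (real pulleys lose some of that to friction). A minimal C++ sketch with assumed, illustrative motor numbers - not Artbyrobot's actual specs:

#include <cstdio>
#include <initializer_list>

int main() {
    // Assumed figures for a small high-speed, low-torque BLDC motor:
    const double motor_rpm    = 20000.0; // free speed
    const double motor_torque = 0.05;    // stall torque, N*m

    // An ideal N:1 reduction trades speed for torque one-for-one.
    for (double ratio : {32.0, 64.0}) {
        std::printf("%2.0f:1 -> %6.0f rpm, %4.1f N*m (ideal, lossless)\n",
                    ratio, motor_rpm / ratio, motor_torque * ratio);
    }
    return 0;
}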


174 posts and 108 images omitted.
>>36917 >This is not what chatgpt said would happen so chatgpt failed me this time. Haha, better get used to that, my fren! PozzGPT is a large language model, a statistical artifact (and one explicitly being 'taught' to (((lie))), at that! :D, not a Mechanical Engineer. We'd all do well to remember these facts. :^) So, I'm not quite sure yet...did your new approach prove sufficient, or no? <---> Regardless, glad to see you continuing to refine your design details unabated, Artbyrobot. This is how you're going to succeed in the end -- simply by not giving up!! Cheers. :^)
>>36919
>Did your new approach prove sufficient?
I don't know yet. I haven't had a chance to test it. I just wanted to show the progress I made with this tentative new solution. If this spring is not strong enough, I can use two springs. I haven't finished connecting the other end up to the finger fully yet, so I can't test yet. That will come next.
>>36922 Ahh, got it. Looking forward to seeing how things work out with this Anon. I'm sure you'll figure things out in the end! Cheers. :^)
Open file (383.02 KB 904x648 IMG_2378.jpeg)
Ok, so a few minor updates: I have decided that since I am employing tension springs to actively work against the motors in a constant tug-of-war while the motors try to grasp, I'm losing grip strength based on that. To make up for that, I'm going to use a separate motor for the distal-most fingertip joint and the second-to-distal-most fingertip joint, rather than have a single motor do both of these joints. I made these adjustments in my CAD. I will have to change the tubing setup for the grasping tubing of the index finger to reflect this change too. This will also give the fingers even more precision and dexterity in the end - not to mention a massive boost in strength - so it's well worth it.

I also decided to use N20 gear motors for the axial rotation of the base of the fingers, instead of BLDC motors like everything else, since these will only be used when doing the tiniest of micro adjustments and rarely employed - so a little gear noise once in a blue moon for this precision work on a tiny scale should not be that bad. So that's 4 N20 gearmotors going in. These are being used just to save a bit on space taken and pulleys needed. I'm putting these 4 into the forearm in the location pictured.

Next, when the spring is pulling, I noticed the PTFE guidance tubing goes from straight and relaxed to wavy under the tension. It is trying to compress under the friction, which is what causes this. In the worst cases, Will Cogley's robot hand project had this same issue, and the tubing literally compacted so much near the ends that it developed wrinkles/folds where it was crushing the tubing and destroying itself under the pressure. Mine is not that extreme, but this is WHY people put metal coils around the tubing for bike brakes: to prevent crushing forces on the tubing. I don't think I will need this, but I might put it in certain places as a last-ditch effort if needed later. That said, to prevent some of this compaction on the spring's tubing, I'm going to be using TWO tubes, which will divide these forces by 2, sharing the load between them evenly. So the tension spring will have two fishing lines coming off of it and two tubes to guide that line to the finger joint where it does its thing.
>>37130
>To make up for that, I'm going to use a separate motor for the distal-most fingertip joint and the second to distal-most fingertip joint rather than have a single motor do both of these joints.
So, I think I'm understanding you to say each of the 4 fingers on the hand will have 2 separate actuation motors? How about the thumb? Any specifics defined yet for how you'll be approaching the 'opposable thumb' type motion?
>I'm putting these 4 into the forearm in location pictured.
This should afford you some volume for sound-deadening shrouding. Any plans yet for cooling them? What about cooling in general, BTW?
>so it's well worth it.
Yes indeedy! :^)
<--->
>I don't think I will need this but I might put it in certain places as a last ditch effort if needed later
>I'm going to be using TWO tubes which will divide up these forces causing this by 2. Sharing the load between them evenly.
You were discussing these thin & lightweight spring 'wires' earlier (cf. >>36662). What about the idea of sliding your tubes down inside a larger-diameter one of these, then applying a thin bead of silicone or some other fixant (epoxy?) to rigidize these Bowden cables for you?
<--->



General Robotics & AI News Thread 5: Gazing Into the Nightmare Rectangle Edition Greentext anon 11/04/2024 (Mon) 05:42:08 No.34233 [Reply] [Last]
Anything related to robowaifus, robotics, the AI industry, and any social/economic issues thereof. And /pol/ funposting containment bread! :D
-previous threads:
> #1 ( >>404 )
> #2 ( >>16732 )
> #3 ( >>21140 )
> #4 ( >>24081 )
>===
-funposting edit
Edited last time by Chobitsu on 01/27/2025 (Mon) 09:51:47.
221 posts and 31 images omitted.
Turns out, Grok-3 is apparently guilty of double-plus badthink. The Ministry of Truth EU isn't inviting little Grokky in to play with little Sven & Sveta. How did it all come to this, Edolf Muskler!??? <---> Heh, I'd sure like to try this thing out now that I know this. Any anonymous access pathways for it?
>>37100 You can pay with crypto and not give out personal info on OpenRouter; it has Grok 2, and hopefully 3 will be there soon.
>>37108 Thanks for the protip, Anon. Cheers!
Open file (357.81 KB 1057x1281 snzphecy6fke1.jpeg)
Xisters, buckle up. I'm hoping to see some cool open source tech. I'm less interested in the actual models and more in the research. IIRC, Microsoft evaluated the DeepSeek R1 model and its paper, and decided that there's still some secret stuff that DeepSeek didn't put in the paper.
>>37132 Neat! This sounds like an exciting week coming up. Thanks for the headsup, Anon. What we all really need is real transparency regarding the 'intuit' process of the DeepSeek 'mind'. As autists (by and large, one degree or other), we pretty much all wish to be able to carefully-craft our waifu's minds for our desired outcomes, I think. Just like auto mechanics, we need to be able to open things up and 'peek under the hood', so to speak. <---> Also, do you know if the DS researchers are digging into robotics at all? Motion planning & control of robowaifus is very-obviously a high priority goal for us here on /robowaifu/ . Cheers, Anon. >Xisters Lol. Lebbit, pls. We are hearty menfolk here, pureblood Viking stock. :^) Yeeaaarrrghh!! >Do you even September 19th, bro? :D

Open file (590.59 KB 1168x1267 1584278178516.png)
Robowaifu Technician 01/19/2025 (Sun) 00:07:32 No.35816 [Reply] [Last]
Welcome all Nandroid fans to the Emmy thread, for discussing and posting about EtR. Off-topic posts and personal attacks will be deleted.
---
Also, be sure to check out Emmy-Pilled's project thread! (>>25306)
Important Community Links:
Boorus, etc.: https://nandroid.booru.org/index.php
Google Docs: https://docs.google.com/spreadsheets/d/1mXuNh9ESedCiDZclVuz9uiL7nTNk3U9SgCE_CRHi3Us/htmlview#
Webtoons: https://m.webtoons.com/en/canvas/emmy-the-robot/list?title_no=402201
>previous threads:
>>27481 >>26629 >>30919


199 posts and 76 images omitted.
>>37112 It's going slow because it's gonna release it all in the physical version
>>37117 doubt
Post nans or get bans.
>>37131 Talking about the comic is talking about nans, you fucking retard
>>37133 Post nans OR Get bans

LLM & Chatbot General Robowaifu Technician 09/15/2019 (Sun) 10:18:46 No.250 [Reply] [Last]
OpenAI/GPT-2
This has to be one of the biggest breakthroughs in deep learning and AI so far. It's extremely skilled in developing coherent, humanlike responses that make sense, and I believe it has massive potential. It also never gives the same answer twice.
>GPT-2 generates synthetic text samples in response to the model being primed with an arbitrary input. The model is chameleon-like - it adapts to the style and content of the conditioning text. This allows the user to generate realistic and coherent continuations about a topic of their choosing
>GPT-2 displays a broad set of capabilities, including the ability to generate conditional synthetic text samples of unprecedented quality, where we prime the model with an input and have it generate a lengthy continuation. In addition, GPT-2 outperforms other language models trained on specific domains (like Wikipedia, news, or books) without needing to use these domain-specific training datasets.
Also, the current public model shown here only uses 345 million parameters; the "full" AI (which has over 4x as many parameters) is being withheld from the public because of its "potential for abuse". That is to say, the full model is so proficient in mimicking human communication that it could be abused to create new articles, posts, advertisements, even books, and nobody would be able to tell that there was a bot behind it all.
<AI demo: talktotransformer.com/
<Other links:
github.com/openai/gpt-2
openai.com/blog/better-language-models/
huggingface.co/


Edited last time by Kiwi_ on 01/16/2024 (Tue) 23:04:32.
459 posts and 118 images omitted.
>>37079 Yeah, about that... I hesitated to make my little joke. I really do hate what kikes + their glowniggers have been able to do to the software & computing industries. Still, we actually have a very strong position against most of their antics today. <---> But I still refuse to play their Terminators-R-Us pay-for-play game with them. And I'd advise every'non here to avoid milcon or other zogbot -oriented work. I'd consider the real costs to be far higher than anything they could possibly pay, tbh. :/
>>37072 Thanks! I wasn't productive, I was going there for a concert. It was great :3 Now I'm back. I wasn't feeling like deving, but after 5 coffees I made some progress. The initial tests with the story mode are great. Here you see the regular chat in the assistant turns, and the agency stuff as user input. I fake-broke-up with her (db rollback) and the response is great. Mind you, this is Hermes-3-Llama-3.1-8B.Q8_0.gguf. When (ab)using the <think> tag with Dolphin3.0-R1-Mistral-24B-Q4_K_M.gguf I had to regenerate sooo many times because it hallucinated or something, but this just works. Right now the emotions are kind of useless; using the dialogue alone would probably generate a similar answer. The goal is to couple that with a persistent emotion state with a decay to baseline, and use the same subsystem principle for other stuff. I'll experiment with some other subsystems right now, like some sort of reflection thing where she's "aware" of the subsystems and can reference them.
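For readers following along, a minimal sketch of that "persistent emotion state with a decay to baseline" idea, in C++ for illustration (all names and constants here are assumptions, not taken from Anon's actual codebase): each tick closes a fixed fraction of the gap back toward the baseline, while dialogue events add impulses on top.

#include <algorithm> // std::clamp (C++17)
#include <cstdio>

// One emotion channel: its value relaxes toward a resting baseline each tick,
// and events (e.g. a breakup line in the dialogue) add impulses on top.
struct EmotionChannel {
    double value;    // current intensity, kept in [-1, 1]
    double baseline; // resting state the value decays back to
    double decay;    // fraction of the gap closed per tick

    void tick() { value += (baseline - value) * decay; }
    void impulse(double delta) { value = std::clamp(value + delta, -1.0, 1.0); }
};

int main() {
    EmotionChannel affection{0.8, 0.2, 0.25}; // value, baseline, decay
    affection.impulse(-0.9);                  // "fake breakup" event: 0.8 -> -0.1
    for (int t = 0; t < 5; ++t) {
        affection.tick();                     // relaxes back toward 0.2 baseline
        std::printf("tick %d: value = %.3f\n", t, affection.value);
    }
    return 0;
}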
>>37121 Pretty brutal, Anon. <---> So, in the context of say, a newcomer (me ofc, but others as well), how do you do this? Are there links to documentation or something for it (either for the models themselves, or discussing your own modifications if those are ready yet)? This would probably help other Anons here come up to speed with you, if you had a mind to see that. This is an interesting project Anon. Good luck with it. Cheers.
>>37122 I honestly feel bad every time I have to test for extreme reactions :( Sorry, right now the code is the only documentation. During the bus ride home I started making a presentation on my phone for the whole project. Once all the architecture changes are integrated, I might make a video on YouTube going into detail. Problem is, things are changing so fast. Even though it's on GitHub, I don't really treat it like a public project with proper commits. I'm glad no one else is helping me. Imagine making some changes and then I push a 36-files-changed commit that fucks everything up for you. I should really start making feature branches. I'll post a quick rundown here in the next few days, once I'm sure the changes I'm making right now are working as intended.
>>37123 Sounds great! Really looking forward to it all, Anon.
>Imagine making some changes and then I push a 36-files-changed commit that fucks everything up for you. I should really start making feature branches.
Heh, just imagine what it's like for Gerganov [1] rn: in about 2 years' time he went from a smol set of quiet, personal little projects to thousands of forks & contributors, shaking the world of AI today. What a ride!
>tl;dr
Better buckle up, Anon! You may do something similar. Cheers. :^)
---
1. https://github.com/ggerganov

C++ General Robowaifu Technician 09/09/2019 (Mon) 02:49:55 No.12 [Reply] [Last]
C++ Resources general
The C++ programming language is currently the primary AI-engine language in use.
>browsable copy of the latest C++ standard draft:
https://eel.is/c++draft/
>where to learn C++: ( >>35657 )
isocpp.org/get-started
https://archive.is/hp4JR
stackoverflow.com/questions/388242/the-definitive-c-book-guide-and-list
https://archive.is/OHw9L
en.cppreference.com/w/


Edited last time by Chobitsu on 01/15/2025 (Wed) 20:50:04.
315 posts and 82 images omitted.
>>37096 In this type of case, if this was a large container (say, >10'000 items), then we could also use the par_unseq execution policy tag [1] for no-fuss, optimized, native multi-threaded parallel execution against that container. Eg:
>find_if_optimizer_v2.cpp snippets :

...
#include <execution>

using std::execution::par_unseq; // [ -std=c++17 ]
...
std::vector<Joint> joints_big(1'000'000); // one million defaulted Joint 's

// TODO: set an example, random joint active. eg :
// joints_big[897'128].active = true;
// joints_big[897'128].angle  = 187.3;
...
auto const par_iter =
    std::find_if(par_unseq, joints_big.cbegin(), joints_big.cend(),
                 [](auto const& joint) { return joint.active; });
...
if (par_iter != joints_big.cend()) // if found, what's its angle?


Edited last time by Chobitsu on 02/20/2025 (Thu) 09:58:04.
>Au: A C++14-compatible units library, by Aurora
>A C++14-compatible physical units library with no dependencies and a single-file delivery option. Emphasis on safety, accessibility, performance, and developer experience.
---
>Au (pronounced "ay yoo") is a C++ units library, by Aurora. What the <chrono> library did for time variables, Au does for all physical quantities (lengths, speeds, voltages, and so on). Namely:
>Catch unit errors at compile time, with no runtime penalty.
>Make unit conversions effortless to get right.
>Accelerate and improve your general developer experience.
>In short: if your C++ programs handle physical quantities, Au will make you faster and more effective at your job.
https://aurora-opensource.github.io/au/main/
https://github.com/aurora-opensource/au
<--->


Edited last time by Chobitsu on 02/20/2025 (Thu) 17:59:58.
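A quick usage sketch in the style Au's docs describe - hedged: the exact quantity-maker names (meters, seconds, second) and the .in() conversion shown here are assumptions to verify against the linked docs before relying on them:

#include <iostream>
#include "au.hh" // Au's single-file delivery option

using namespace au;

int main() {
    // Quantities carry their unit in the type, so unit mismatches
    // fail to compile rather than silently corrupting values.
    constexpr auto distance = meters(100.0);
    constexpr auto elapsed  = seconds(9.58);
    constexpr auto speed    = distance / elapsed;

    // .in(...) converts and extracts the raw value in the requested unit.
    std::cout << speed.in(meters / second) << " m/s\n";
    return 0;
}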
>>37116 Made no difference for me. I benchmarked it, and a simple loop is fractionally better every time with -O2; without optimization it's like 10x worse than a loop lol. There must be a lot of overhead when using these iterator things and things you count on getting optimized away; with a loop the cpu already does the optimization for you.

#include <algorithm>
#include <execution>

using std::execution::par_unseq; // [ -std=c++17 ]

#include <iostream>
#include <vector>
#include <time.h>

using std::cout;

struct Joint {
    bool active = false;
    int index = 0;
};

struct Joint *loop_find(struct Joint *J, int len)
{
    // plain linear scan (the post was cut off here; this body is the
    // obvious reconstruction of what was being benchmarked)
    for (int i = 0; i < len; i++)
        if (J[i].active)
            return &J[i];
    return NULL;
}


Edited last time by Chobitsu on 02/20/2025 (Thu) 22:32:31.
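For anyone wanting to reproduce this comparison locally, a minimal, self-contained timing-harness sketch (an illustration, not the quick-bench setup mentioned below; note that at -O2 the optimizer may delete work whose result is never used, so make sure each timed call's result is actually consumed):

#include <chrono>
#include <cstdio>

// Runs an arbitrary callable `reps` times and reports the mean wall time.
template <typename F>
double mean_ms(F&& f, int reps = 1000)
{
    auto const t0 = std::chrono::steady_clock::now();
    for (int i = 0; i < reps; ++i)
        f();
    auto const t1 = std::chrono::steady_clock::now();
    return std::chrono::duration<double, std::milli>(t1 - t0).count() / reps;
}

// usage sketch (names from the posts above; `sink` keeps the work alive):
// std::printf("loop: %f ms\n", mean_ms([&] { sink = loop_find(data, len); }));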
>>37126 Great! Thanks for taking the time to check up on me here. It's nice to have other Anons come forward to engage with C++ development for our robowaifus. Please contribute good C++ ideas for us all here. TWAGMI
<--->
I plan on contriving some actual testing benchmarks at quick-bench.com, utilizing & testing the approach you've given. (If you'd like to do so, please feel free.) Probably save a year or so's delay atp. :^)
Cheers, Anon! :^)

/robowaifu/meta-10: Building our path to the good end. Greentext anon 08/12/2024 (Mon) 05:24:31 No.32767 [Reply] [Last]
/meta, offtopic, & QTDDTOT
<--- Mini-FAQ
A few hand-picked posts on various /robowaifu/-related topics:
--->Lurk less: Tasks to Tackle ( >>20037 )
--->Why is keeping mass (weight) low so important? ( >>4313 )
--->How to get started with AI/ML for beginners ( >>18306 )
--->"The Big 4" things we need to solve here ( >>15182 )
--->HOW TO SOLVE IT ( >>4143 )
--->Why we exist on an imageboard, and not some other forum platform ( >>15638, >>31158 )
--->This is madness! You can't possibly succeed, so why even bother? ( >>20208, >>23969 )
--->All AI programming is done in Python. So why are you using C & C++ here? ( >>21057, >>21091, >>27167, >>29994 )
--->How to learn to program in C++ for robowaifus? ( >>18749, >>19777 )
--->How to bulk-download AI models from huggingface.co ? ( >>25962, >>25986 )


Edited last time by Chobitsu on 01/02/2025 (Thu) 09:47:42.
428 posts and 77 images omitted.
I'm very upset by the leading sentence on the site: "Advancing robotics to a point where anime catgrill meidos in tiny miniskirts are a reality." No mention at all of Foxgirls! A huge travesty! Foxgirls are far cuter than cat girls. https://www.reddit.com/media?url=https%3A%2F%2Fi.redd.it%2F04dd70vnrl041.png :)
>>37042 I don't like that you deleted my mention of [redacted]; you are not gonna lose money for mentioning them. Maybe this will make you change your mind: in this thread I explain that female hypergamy is growing exponentially and most people are also being born with level 1 autism now: https://neets.net/threads/the-behavioral-sink-on-humans-thread-version-1-1.39107/ I also have another thread where I explain why we will have android robots by 2032, but I think I already showed it to you in the past.
>===
-rm recs
Edited last time by Chobitsu on 02/17/2025 (Mon) 15:12:54.
>>37061
>I don't like that you delete my mention of ---, --- and the cheaper ---, you are not gonna lose money for mentioning them.
Lol. Always about the money with you, Anon. I assure you, no money is being made here. This is an amateur, anonymous, DIY "workshop" board -- we're all just here to brainstorm, share information, prognosticate, and help & encourage one another while working on our robowaifu-oriented projects together. :^)
Anyway, I'll gatekeep this board as I see fit; I'm generally quite upfront about it whenever I do. Same goes for Kiwi. I hope you can deal with that; it's definitely nothing personal, Anon, just looking out for the board's & Anons' interests here.
<--->
Regardless, thanks for your insights & encouragements. I hope you do well with your Anthro studies, and that you manage somehow to do something positive with that. Sure & certain that field is in need of a LARGE dose of good things! Cheers, Laroi. :^)
>===
-prose edit
Edited last time by Chobitsu on 02/17/2025 (Mon) 21:14:54.
>>37060 >Foxgrils Pfft, you silly kitsunes...every'non knows wuffgrils a best! :DD <awooos over a fresh bushel basket of apples* >=== -minor edit
Edited last time by Chobitsu on 02/17/2025 (Mon) 15:45:21.
I'm not seeing some kind of thread suited to Robowaifu Motion Planning, or am I missing it? It seems to me this is a very-important topic, and one that is kind of general as well. That is, we need to plan the motions of:
-the robowaifu herself (as a unified system), through say, Anon's flat
-the robowaifu's arms & other (internal/onboard) skellington components, as she's going about her day-to-day tasks
-other 'objects' in motion around the robowaifu (say, of Anon himself), for her predictive planning purposes.
<--->
What got me thinking about this need was trying to figure out where to post this link:
https://www.youtube.com/watch?v=8D7vpR9WCtw
Making Hard C++ Tests Easy: A Case Study From the Motion Planning Domain - Chip Hogg - CppCon 2024
>If you've ever struggled to write tests for domain-specific functions with complicated, real-world inputs, this talk is for you! We'll use the Motion Planning component in a self-driving stack as a case study (although you won't need any prior Motion Planning experience to follow the talk). In building objects for our test inputs, we faced the classic dilemma. If we make the objects simple, they're hardly meaningful, and the tests amount to little more than smoke tests. If we try constructing more realistic objects, it takes tremendous amounts of boilerplate code (which obscures what is actually being tested) --- and what's worse, these finicky construction methods go stale and break easily as implementation details evolve.
>There is a better way! To find it, we take our inspiration from a (paraphrased) Kent Beck quote: "First, make the test easy. (Warning: this may be hard!) Then, write the easy test." What this means is investing in full-fledged testing support libraries. First, we build foundational domain-specific APIs: user-friendly paths, poses, and speed profiles. Then, we provide APIs that bridge the gap between what the user pictures in their head when they want to write a test (a "scene"), and the data structures that represent that situation in the software (a sequence of planner input messages on different channels). This "scene description" that the user writes is high level and doesn't depend on implementation details, so it resists going stale, as we'll illustrate with an example involving road construction zones. On the implementation side, the problem decomposes beautifully: each planner input (map, actors, etc.) can be constructed from the information in the scene description, independently of all other inputs, helped along by the paths and speed profiles.
>These ideas could be implemented in many languages, but C++ particularly excels at delivering performance, robustness, and flexible, natural APIs. As an example, we'll explain the benefits of describing each planner input with a "smart" tag type, containing both the name and the message type. On the implementation side, variadic templates make it easy to conjure up containers and interfaces that operate on "all planner inputs", eliminating the risk of forgetting to update some callsite when we add a new one. On the interface side, end users see none of this complexity: they can simply use instances of these tag types as "indexes" into these containers in a natural way. This fluency and power makes more complicated test cases possible: it becomes easy to select and "tweak" any planner input to ensure we respond correctly when the messages are malformed, delayed, or absent.
>Overall, we hope our experience enabling high-quality Motion Planning testing at scale will have lessons that can be adapted to a variety of other domains.
<--->


Edited last time by Chobitsu on 02/20/2025 (Thu) 17:50:37.
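For anyone curious, a minimal sketch of the "smart tag type" trick that abstract describes (all names here - PlannerInputs, MapInput, etc. - are hypothetical illustrations, not code from the talk): each planner input is a tag carrying its channel name and message type, a variadic container holds one slot per tag, and the tag type itself serves as the "index".

#include <iostream>
#include <string>
#include <tuple>

// Hypothetical message types, one per planner input channel.
struct MapMsg    { std::string data; };
struct ActorsMsg { std::string data; };

// "Smart" tag types: each carries the channel name and its message type.
struct MapInput    { static constexpr const char* name = "map";    using Message = MapMsg; };
struct ActorsInput { static constexpr const char* name = "actors"; using Message = ActorsMsg; };

// One slot per tag; adding a new input to the pack updates every operation
// over "all planner inputs" automatically, so no callsite can be forgotten.
template <typename... Tags>
struct PlannerInputs {
    std::tuple<typename Tags::Message...> slots;

    template <typename Tag> // the tag type itself is the "index"
    typename Tag::Message& get() { return std::get<typename Tag::Message>(slots); }

    template <typename F>   // visit all inputs via a C++17 fold expression
    void for_each(F&& f) { (f(Tags{}, std::get<typename Tags::Message>(slots)), ...); }
};

int main() {
    PlannerInputs<MapInput, ActorsInput> in;
    in.get<MapInput>().data = "construction zone"; // "tweak" one input
    in.for_each([](auto tag, auto const& msg) {
        std::cout << decltype(tag)::name << ": " << msg.data << '\n';
    });
    return 0;
}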

Open file (173.41 KB 1080x1349 Alexandra Maslak.jpg)
Roastie Fear 2: Electric Boogaloo Robowaifu Technician 10/03/2019 (Thu) 07:25:28 No.1061 [Reply] [Last]
Your project is really disgusting
>===
Notice #2: It's been a while since I locked this thread, and hopefully the evil and hateful spirit that was dominating some anons on the board has gone elsewhere. Accordingly I'll go ahead and unlock the thread again provisionally as per the conditions here: >>12557
Let's keep it rational OK? We're men here, not mere animals. Everyone stay focused on the prize we're all going for, and let's not get distracted. This thread has plenty of good advice in it. Mine those gems, and discard the rest.
Notice:


Edited last time by Chobitsu on 09/02/2021 (Thu) 18:36:20.
396 posts and 111 images omitted.
Anon finally unlocks the truth behind the cycles.
>protip: it's a 5-cycle, not a 4-cycle
https://trashchan.xyz/robowaifu/thread/26.html#51
At last we all know the truth!111 :D
>>37015 picture from the link
>>37111 >digits Thanks kindly, NoidoDev.
>commies and sjw
Are they the same thing now, according to zoomers that use words like simp?
>>37119 Lol. >Filthy Commies >Libsh*te SJWs BUT I REPEAT MYSELF I'd kinda like to turn Grok-3 loose here on this topic in our /pol/ funposting zone... but I'll restrain myself. :^) . .. ... for now. :D

DCC Design Tools & Training Robowaifu Technician 09/18/2019 (Wed) 11:42:32 No.415 [Reply] [Last]
Creating robowaifus requires lots and lots of design and artistry. It's not all just nerd stuff you know, Anon! :^)
ITT: Add any techniques, tips, tricks, or content links for any Digital Content Creation (DCC) tools and training to use when making robowaifus.
>note: This isn't the 'Post Your Robowaifu-related OC Arts & Designs General' thread. We'll make one of those later perhaps.
>---
I spent some time playing with the program Makehuman and I'll say I wasn't impressed. It's not possible to make anime real using Makehuman; in fact the options (for example eye size) are somewhat limited. But there's good news, don't worry! The creator of Makehuman went on to create a blender plugin called ManuelBastioniLAB which is much better (and has a much more humorous name). The plugin is easy to install and use (especially if you have a little blender experience). There are three different anime female defaults that are all pretty cute. (Pictured is a slightly modified Anime Female 2.) There are sliders to modify everything from finger length to eye position to boob size. It even has a posable skeleton. Unfortunately the anime models don't have inverse kinematic skeletons, which are much easier to pose. Going forward I'm probably going to use MasturBationLAB ManuelBastioniLAB as the starting point for my designs.
---
Saving everything with yt-dlp (>>16357, >>12247)
>===


Edited last time by Chobitsu on 05/31/2023 (Wed) 18:55:38.
274 posts and 152 images omitted.
>>36567 Yeah, pretty good. I can't stop laughing. But yeah, generally it's still good of course - especially for those who like big eyebrows or coneheads.
Open file (239.26 KB 641x278 Conehead.png)
>>36569 I think there's a market for it
>>36569 >>36572
>tfw ywn a cone-headed waifu...
Oh wait! I guess maybe we could **someday soon** if we really wanted to! :D
Open file (114.30 KB 640x225 BOSL2logo.png)
The Belfry OpenScad Library, v2
A library for OpenSCAD, filled with useful tools, shapes, masks, math and manipulators, designed to make OpenSCAD easier to use. Requires OpenSCAD 2021.01 or later.
NOTE: BOSL2 IS BETA CODE. THE CODE IS STILL BEING REORGANIZED.
NOTE2: CODE WRITTEN FOR BOSLv1 PROBABLY WON'T WORK WITH BOSL2!
https://github.com/BelfrySCAD/BOSL2
>>37110 POTD Excellent resource, Kiwi. Thanks! :^)
