/robowaifu/ - DIY Robot Wives

Advancing robotics to a point where anime catgrill meidos in tiny miniskirts are a reality.

Site was down because of hosting-related issues. We're figuring out why it happened now.

Build Back Better

Sorry for the delays in the BBB plan. An update will be issued in the thread in late August. -r


When the world says, “Give up,” Hope whispers, “Try it one more time.” -t. Anonymous


Open file (349.32 KB 480x640 0c.png)
Robowaifu Media Propaganda and Merchandizing Anon 01/29/2023 (Sun) 22:15:50 No.19295 [Reply] [Last]
That Time I Incarnated My Christian Cat Girl Maid Wife in a Short Dress Together we can create the robowaifu-verse! Let's start making media to further our cause. There's tremendous potential for us to reach a wide market and prime minds to accept their future cat grill meidos in tiny miniskirts. We have text-based stories floating around, but audio and visuals are paramount to gain a general audience. I will start with a short about a young man building a small cat girl and learning to understand her love. Importantly, she will be very limited, in the same way our first models will be, to set certain expectations. Mahoro is impossible, but the slow and simple robowaifus that can exist have their own charm. Let's show the world the love that will soon exist. --- >thread-related Robowaifu Propaganda and Recruitment (>>2705) >=== -add related crosslink
Edited last time by Chobitsu on 02/02/2023 (Thu) 22:18:06.
74 posts and 44 images omitted.
>>33390 I think these lyrics aren't appropriate just yet, too negative, too stressful, too scary. We should stick to positive and calming.
>>33534 Honestly, thx for saying so. Soon after posting, I realised that last part probably wasn't very /robowaifu/ and the singer's attitude changes halfway through, fu-. I must have been more emotionally influenced than I thought - I'll keep that in mind. Just hope the first half wasn't too bad.
>>33534 This. The yuge complexity we're all facing is quite stressful enough all on its own! :^)
Open file (1.91 MB 918x1200 ChiiByIchiTATa.jpg)
How can an AI be trusted? How could we prove that our AI is trustworthy? How would we demonstrate that people should put their trust in the works of our members? What methods would we use to convey this to a wide audience? I still plan to post videos of myself with a prototype MaidCom once she is presentable. We need more, we need widespread belief that our way is safe, trustworthy, and worthy of public use. We also need to prove that Claude, Gemini, and other competing AI are less worthy. We need to open a path where FOSS AI is the mainstream. We need to do so fast, strike before they can regulate us into obscurity. https://www.youtube.com/watch?v=KUkHhVYv3jU
>>34287 Wow! That video was really cool, Kiwi. Thanks for sharing it! :^) <---> As to your question. Again, wow! That's a really, really tough proposition, Anon. BTW You sure this wouldn't be better in the Safety/Security, or Cognitive, or Philosophy, or Ethics/Morals thread(s)? I trust no man (or woman obvs, lol). Not even myself. God alone do I really trust with everything (or so I claim lol; how can one really know until 'everything' is put to the actual test? :^) But that's a matter of my own faith in God's deeds and words (which are innumerable!) So I think the >tl;dr here rightly is: We can't. Probably not the answer you wanted, but it's the one you need -- and it saves me literal years of fruitless searching (just like the fabled King, lol!) <---> OTOH, we have no excuses not to make our works as trustworthy and reliable as we can. OTOOH, this type of goal costs: time, money, effort, intelligence. These things tend to be in short supply in many Anons lives. But all can be either gained or supplied -- except time.



SPUD (Specially Programmed UwU Droid) Mechnomancer 11/10/2023 (Fri) 23:18:11 No.26306 [Reply] [Last]
Henlo anons, Stumbled here via youtube rabbit hole & thought I'd share my little side project. Started out as just an elaborate way to do some mech R&D (making a system to generate animation files in Blender on Windows and export/transfer them to a Raspberry Pi system) and found tinkering with the various python libraries a kinda neat way to pass the time when weather doesn't permit my outside mech work. Plus I'd end up with a booth babe that I don't have to pay or worry about running off with a convention attendee. Currently running voice commands via google speech and chatgpt integration but I'm looking into offline/local stuff like openchat. WEF and such are so desperate to make a totalitarian cyberpunk dystopia I might as well make the fun bits to go along with it. And yes. Chicks do dig giant robots.
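The Blender-to-Pi animation pipeline described above could be sketched roughly like this (a purely hypothetical illustration — the JSON layout, joint names, and 24 fps rate are made up, not SPUD's actual format): dump per-frame joint angles to a file on the desktop side, then load and replay them on the robot side.

```python
import json
import os
import tempfile

def export_keyframes(frames, path, fps=24):
    # frames: list of {joint_name: angle_degrees} dicts, one per frame.
    # On the desktop, these would come from Blender's animation data.
    with open(path, "w") as f:
        json.dump({"fps": fps, "frames": frames}, f)

def load_keyframes(path):
    # On the Pi side: read the file back and return the playback rate
    # plus the per-frame joint targets.
    with open(path) as f:
        data = json.load(f)
    return data["fps"], data["frames"]

frames = [{"neck": 0.0, "arm": 10.0},
          {"neck": 2.0, "arm": 12.5}]
path = os.path.join(tempfile.gettempdir(), "anim_demo.json")
export_keyframes(frames, path)
fps, replay = load_keyframes(path)  # replay at 1/fps intervals on the robot
```

The point of the split is that the heavy authoring tool (Blender) and the lightweight playback loop (the Pi) only need to agree on the file format, nothing else.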
497 posts and 277 images omitted.
>>34226 Hmmm James might be onto something here. I'll be back in a while lol. https://www.youtube.com/watch?v=AEXz8xyTC54
>>34226 Those speaker boobs are comically disturbing due to how they recess. Do they have some sort of cover that goes over them? It looks like you made some sort of attachment ring for a mesh or foam cover, I presume. That would help prevent speaker damage.
>>34254 This is a good example of why I keep pushing the use of LARPfoam for our robowaifu's 'undershells'. LARPagans have a big set of communities around this stuff today, and it's a good idea for us here to benefit from all this information. BTW, another Anon also posted this video here, but I can't locate that post ATM. >I'll be back in a while lol. Lol, don't stay away too long, Mechnomancer. ABTW, this is a daily reminder you'll need a new bread when you get back. Alway rember to link the previous throd in your OP. Cheers, Anon. :^) >>34259 Great find Kiwi, thanks very kindly. I wonder what Bruton has in store for his 'next design that walks better'? Cheers. :^)
New bread: >>34445

Open file (32.62 KB 341x512 unnamed.jpg)
Cyborg general + Biological synthetic brains for robowaifus? Robowaifu Technician 04/06/2020 (Mon) 20:16:19 No.2184 [Reply] [Last]
Scientists made a neural network from rat neurons that could fly a fighter jet in a simulator and control a small robot. I think that lab grown biological components would be a great way to go for some robowaifu systems. It could also make it feel more real. https://www.google.com/amp/s/singularityhub.com/2010/10/06/videos-of-robot-controlled-by-rat-brain-amazing-technology-still-moving-forward/amp/ >=== -add/rm notice
Edited last time by Chobitsu on 08/23/2023 (Wed) 04:40:41.
194 posts and 33 images omitted.
>>34135 A yandere terminator goth girl would be lit
https://physicsworld.com/a/genetically-engineered-bacteria-solve-computational-problems/ >Now a research team from the Saha Institute of Nuclear Physics in India has used genetically modified bacteria to create a cell-based biocomputer with problem-solving capabilities. The researchers created 14 engineered bacterial cells, each of which functioned as a modular and configurable system. They demonstrated that by mixing and matching appropriate modules, the resulting multicellular system could solve nine yes/no computational decision problems and one optimization problem.
>>34134 Very smart, Anon. >Ive been working in factories my whole life so probably industrial/controls engineering Makes sense. Welp, master PID [1], Anon. Then you should look into Behavioral Trees afterwards, to make everything accessible for use by us mere mortals. Cheers, Anon. :^) Keep.Moving.Forward. --- 1. https://en.wikipedia.org/wiki/Proportional%E2%80%93integral%E2%80%93derivative_controller >>34170 Very intredasting. Thanks, Anon! Cheers. :^)
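For anyone following the PID pointer above, a minimal discrete-time PID loop might look like the sketch below. The gains, timestep, and toy plant are invented for illustration only — nothing here is tuned for real hardware.

```python
# Minimal discrete-time PID controller sketch (illustrative only).
class PID:
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, setpoint, measured):
        error = setpoint - measured
        self.integral += error * self.dt               # accumulate error
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return (self.kp * error
                + self.ki * self.integral
                + self.kd * derivative)

# Drive a toy first-order plant (think: a joint position) toward a setpoint.
pid = PID(kp=2.0, ki=0.5, kd=0.1, dt=0.01)
position = 0.0
for _ in range(2000):              # 20 simulated seconds
    control = pid.update(setpoint=1.0, measured=position)
    position += control * 0.01     # crude Euler integration of the plant
```

After the loop, `position` has settled near the setpoint; the integral term is what removes the steady-state offset a plain P controller would leave.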
>>34219 Here's something you should keep an eye on. A human connectome would allow a computer to simulate an entire human brain if it bears fruit. They've already done it with a worm and put the simulated brain in robots. A human version would also be theoretically possible https://www.humanconnectome.org/
>>34222 >digits There's about a Zeta of synaptic interconnections within the human connectome. AFAICT, we have the most sophisticated -- by far -- neural systems on the planet. So it probably stands to reason that there's much that could be learned by using human neural tissue for such experiments. Thanks for the information, Anon! Cheers. :^)

Papercraft waifu Robowaifu Technician 09/16/2019 (Mon) 06:21:35 No.271 [Reply]
Thoughts on making a paper waifu then adding robotics? I want animu grills, but most robots have uncanny 3DPD faces that aren't nearly as cute as a real waifu. With paper/screens, at least the face can keep the purity and beauty of 2D.
15 posts and 8 images omitted.
>>34070 Have you considered iron-on transfer paper? :)
>>33827 >papercraft wifu shell Just in case you forgot or someone else wants a good source. I found the mother of all paper mache recipe sites by this Grandmother. It's a great resource for making things from paper. She has all sorts of recipes. Some have various additives so you can get better dimensional stability. For prototyping I don't think paper can be beat. Fast, cheap, and once you have the shell or a paper positive you can create molds from more solid materials while keeping prototype cost to a minimum. It's at this link: >>33318
>>34113 Thanks kindly, Grommet! My plan ATM is simply to unfold my 3D models from Blender into 2D flats, print & cut them out, then assemble them all together /po/ -style. After a coat or two, I can see using some type of papier-mâché coating to fashion a mold perhaps (as you seem to suggest)?
>>34131 That seems like a super fast way to get results. And yes I do mean using P.M. for molds. I was thinking about doing this for fiberglass and boats. I got the idea from the "concrete fabric formworks" guys. Type that into a search and look at some of the images. It's wild what they are doing. Low cost, high quality (most of the water drains out, leaving far stronger concrete due to compaction), and they can do structures where the loads are designed into the form exactly where they are needed. Due to the curvature of the form it automatically forms the correct reinforcement where needed.
>>34147 >And yes I do mean using P.M. for molds. Yeah, thanks I thought so. >and they can do structures that the loads are designed in the form to be exactly where they are needed. We certainly need to take advantage of similar approaches, for similar needs within our robowaifus. For example, mounting these shell pieces on internal endoskellington struts, etc., could use some beefing up on the attach points. Thanks for all the good ideas, Grommet! Please keep them coming, Anon. Cheers. :^)

Open file (522.71 KB 1920x1080 gen.png)
Nandroid Generator SoaringMoon 02/29/2024 (Thu) 13:54:14 No.30003 [Reply]
I made a generator to generate nandroid images. You can use it in browser, but a desktop version (that should be easier to use) will be available. https://soaringmoon.itch.io/nandroid-generator Not very mobile friendly unfortunately, but it does run. I made a post about this already in another thread, but I wanted to make improvements and add features to the software. >If you have any suggestions or ideas other than custom color selection, which I am working on right now, let me know.
21 posts and 10 images omitted.
Open file (7.49 MB 1920x1080 2024-06-06 19-26-31.mp4)
You can now customize eye shadow color.
Nice work SoaringMoon, please keep it up! Cheers. :^)
Open file (79.29 KB 448x736 nandroid (2).png)
>>31460 v0.9 - Sclera color is now customizable.
>>34154 Sweet
>>34154 Glad to see you keeping your project advancing, SoaringMoon. Keep it up! Cheers. :^)

The Sumomo Project Chobitsu Board owner 11/24/2021 (Wed) 17:27:18 No.14409 [Reply] [Last]
So I've been working for a while at devising an integrated approach to help manage some of the software complexity we are surely going to encounter when creating working robowaifus. I went down many different bunny trails and (often) fruitless paths of exploration. In the end I've finally hit on a relatively simplistic approach that AFAICT will actually allow us to both have the great flexibility we'll be needing, and without adding undue overhead and complexity. I call this the RW Foundations library, and I believe it's going to help us all out a lot with creating workable & efficient software that (very hopefully) will allow us to do many things for our robowaifus using only low-end, commodity hardware like the various single-board computers (SBCs) and microcontrollers. Devices like the Beaglebone Blue and Arduino Nano for example. Of course, we likely will also need more powerful machines for some tasks as well. But again, hopefully, the RW Foundations approach will integrate smoothly with that need as well and allow our robowaifus to smoothly interoperate with external computing and other resources. I suppose time will tell. So, to commemorate /robowaifu/'s 5th birthday this weekend, I've prepared a little demonstration project called Sumomo. The near-term goal for the project is simply to create a cute little animated avatar system that allows the characters Sumomo and Kotoko (from the Chobits anime series) to run around having fun and interacting with Anon. But this is also a serious effort, and the intent is to begin fleshing out the real-world robotics needs during the development of this project. Think of it kind of like a kickstarter for real-world robowaifus in the end, but one that's a very gradual effort toward that goal and a little fun along the way. 
I'll use this thread as a devblog and perhaps also a bit of a debate and training forum for the many issues we all encounter, and how a cute little fairybot/moebot pair can help us all solve a few of them. Anyway, happy birthday /robowaifu/ I love you guys! Here is my little birthday present to you. === >rw_sumomo-v211124.tar.xz.sha256sum 8fceec2958ee75d3c7a33742af134670d0a7349e5da4d83487eb34a2c9f1d4ac *rw_sumomo-v211124.tar.xz >backup drop


Edited last time by Chobitsu on 10/22/2022 (Sat) 06:24:09.
159 posts and 97 images omitted.
>>33843 >Blender does a lot of relevant things to support high performance, hard and soft realtime requirements, and heterogeneous development. Not sure what you mean about realtime in Blender's case, but otherwise fair enough. It's a remarkable system today! :^) >Blender's design docs I've seen these in the past, but since I stopped actively building Blender 2-3 years ago, I kind of let it slip my mind. So thanks for the reminder. I personally like Blender's documentation efforts, though I've heard some disagree. Not-uncommonly, this is one of those tasks that get pushed to the 'back burner', and is often left to volunteer work to accomplish. Given the breadth & scope of the platform, I'd say the Blender Foundation has done a yeoman's job at the doco work, overall. Very passable. <---> Also, reading that link reminded me of USD. NVIDIA is currently offering developers their version of free training on this topic, and I've been pondering if I can make the time to attend. A huge amount of the DCC industry has come together to cooperate on Pixar's little baby, and today it's a big, sprawling system. Why it's of interest to us here is that most of what a robowaifu will need to do to analyze and construct models of her 'world' is already accounted for inside this system. While there are plenty of other (often higher-speed) ways to accomplish the same (or nearly the same) tasks, the fact that USD has become such a juggernaut, with a highly-regimented approach to scene descriptions, and with such broad approval, improves the likelihood IMO that other Anons from the film & related industries may in fact be able to help us here once they discover robowaifus in the future -- if we're already using USD to describe her world and the things within it. I hope all that made sense, Anon. https://openusd.org/release/glossary.html# >===


Edited last time by Chobitsu on 10/02/2024 (Wed) 17:12:35.
>>33845 >Not sure what you mean about realtime in Blender's case This page looks relevant: https://developer.blender.org/docs/features/cycles/render_scheduling/ Blender does progressive rendering, which starts by rendering low-resolution frames. If there's extra time left over before a frame needs to be rendered, it generates more samples to generate a higher-resolution frame. The equivalent for video generation at a fixed framerate would be running a small number of denoising steps for the next frame, and running additional denoising steps if the next frame doesn't need to be rendered yet. For text generation at a fixed token rate, it would be equivalent to doing speculative decoding for the initial response, then using (maybe progressively) larger models if the next token doesn't need to be output yet. For a cognitive architecture with a fixed response rate, I think the equivalent would be generating an initial response, then continually refining the response based on self-evaluations & feedback from other modules until the response needs to be output. >USD Very nice. I hadn't heard of this. It looks like a goldmine of information. Your explanation does make sense, and it's a great example of the sort of design patterns that I expect would be useful, in this case for modeling the environment & context.
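The anytime-refinement pattern described above — produce a cheap answer first, then keep improving it while the deadline hasn't hit — can be sketched like so. The function names and the 50 ms "frame budget" are illustrative only, not from Blender or any real codebase.

```python
import time

def progressive_result(deadline, coarse, refine):
    """Anytime computation: cheap answer first, refine while time remains.

    Mirrors Cycles-style progressive sampling: render a low-resolution
    frame immediately, then add samples only if the frame isn't due yet.
    """
    result = coarse()
    while time.monotonic() < deadline:
        result = refine(result)
    return result

# Toy stand-in: each "refinement" just bumps a sample count.
deadline = time.monotonic() + 0.05          # 50 ms frame budget
samples = progressive_result(deadline,
                             coarse=lambda: 1,
                             refine=lambda n: n + 1)
# samples grows with however much slack the budget allowed
```

The same loop shape works for the response-refinement idea: `coarse` is the fast initial reply, `refine` is another self-evaluation pass, and the deadline is when the response must be spoken.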
>>33850 OK good point, CyberPonk. Such UX optimizations can fairly be said to be in the domain of soft-realtime. And certainly, integrating GPU processing code into the system to speed the rendering processes of Cycles & EEVEE has had major positive impacts. I personally think the fact that Ton chose to create the entire GUI for Blender in OpenGL all those years ago has had many far-reaching effects, not the least of which is general responsiveness of the system overall (especially as it has rapidly grown in complexity over the last few years). <---> >It looks like a goldmine of information Glad you like it! It's fairly easy to overlook that describing a scene is in fact a very-complex, nuanced, and -- I'm going to say it -- human undertaking. And when you consider that task from the deeply-technical aspect that USD (and we here) need to accommodate, then you wind up with quite a myriad of seeming-odd-juxtapositions. Until D*sney got their claws into it, Pixar was a one-of-a-kind studio, and well up to such a complicated engineering effort. I doubt they could do it as well today. If at all. DEI DIE to the rescue!111!!ONE! :D Cheers, Anon. :^) >=== -fmt, minor, funpost edit
Edited last time by Chobitsu on 10/03/2024 (Thu) 03:49:23.
>>33857 I looked up that USD. "USD stands for “Universal Scene Description”". I hadn't heard of it. Wow, that's some super comprehensive library and format. Hats off to pixar for open sourcing this.
>>34201 >Hats off to pixar for open sourcing this. Well, it's a vested-interest, but yeah; you're absolutely correct Grommet. Sadly, I'm sure they couldn't even pull it off today; they've become quite afflicted with the incompetency crisis. >protip: competency doesn't cause a crisis, only incompetency does. :^)

Open file (329.39 KB 850x1148 Karasuba.jpg)
Open file (423.49 KB 796x706 YT_AI_news_01.png)
Open file (376.75 KB 804x702 YT_AI_news_02.png)
General Robotics/A.I./Software News, Commentary, + /pol/ Funposting Zone #4 NoidoDev ##eCt7e4 07/19/2023 (Wed) 23:21:28 No.24081 [Reply] [Last]
Anything in general related to the Robotics or A.I. industries, and any social or economic issues surrounding it (especially of robowaifus). -previous threads: > #1 (>>404) > #2 (>>16732) > #3 (>>21140)
524 posts and 191 images omitted.
> this thread <insert: TOP KEK> >"There is a tide in the affairs of men, Which taken at the flood, leads on to fortune. Omitted, all the voyage of their life is bound in shallows and in miseries. On such a full sea are we now afloat. And we must take the current when it serves, or lose our ventures." >t. A White man, and no jew...
>>34164 DAILY REMINDER We still need a throd #5 here. Would some kindly soul maybe NoidoDev, Greentext anon, or Kiwi please step up and make one for us all? TIA, Cheers. :^)
Open file (2.43 MB 2285x2962 2541723.png)
>>34230 Guess it's up to me again. This was much easier than the meta thread. Took me like fifteen minutes, and ten of those were spent browsing in my image folders for the first two pics. Changes are as follows: + New cover pic + Added poner pic + New articles ~ Minor alteration to formatting >>34233
>>34234 >Guess it's up to me again. Thanks, Greentext anon! Cheers. :^)
>>34234 NEW THREAD NEW THREAD NEW THREAD >>34233 >>34233 >>34233 >>34233 >>34233 NEW THREAD NEW THREAD NEW THREAD

Open file (14.96 KB 280x280 wfu.jpg)
Beginners guide to AI, ML, DL. Beginner Anon 11/10/2020 (Tue) 07:12:47 No.6560 [Reply] [Last]
I already know we have a thread dedicated to books, videos, tutorials etc. But there are a lot of resources there and as a beginner it is pretty confusing to find the correct route to learn ML/DL advanced enough to be able to contribute to the robowaifu project. That is why I thought we would need a thread like this. Assuming that I only have basic programming in Python, dedication, love for robowaifus but no maths, no statistics, no physics, no college education, how can I get advanced enough to create AI waifus? I need a complete pathway directing me to my aim. I've seen that some of you guys recommended books about reinforcement learning and some general books, but can I really learn enough by just reading them? AI is a huge field so it's pretty easy to get lost. What I did so far was to buy a great non-English book about AI, philosophical discussions of it, general algorithms, problem solving techniques, history of it, limitations, game theory... But it's not a technical book. Because of that I also bought a few courses on this website called Udemy. They are about either Machine Learning or Deep Learning. I am hoping to learn basic algorithms through those books but because I don't have maths it is sometimes hard to understand the concepts. For example even when learning linear regression, it is easy to use a python library but I can't understand how it exactly works because of the Calculus I lack. Because of that issue I have a hard time understanding algorithms. >>5818 >>6550 Can those anons please help me? Which resources should I use in order to be able to produce robowaifus? If possible, you can even create a list of books/courses I need to follow one by one to be able to achieve that aim of mine. If not, I can send you the resources I got and you can help me to put those in an order. I also need some guidance about maths as you can tell.
Yesterday, after deciding and promising myself that I will give whatever it takes to build robowaifus, I bought 3 courses about linear algebra, calculus, and stats, but I'm not really good at them. I am waiting for your answers, anons, thanks a lot!
74 posts and 106 images omitted.
>>33765 > But I could see training one to recognize voice, one to deal with movement, one to deal with vision, specifically not running into things, and maybe one for instruction on a low level. Like move here, pick up this, etc If I remember right Mark Tilden referred to this as a "horse and rider" setup, where you have a high level program giving direction to a lower level program. The lower level worries about not stepping in a hole, etc. while the high level worries about where the pair are going. I too have experienced the boons of separating different functions into different programs/AI. To give a real life example of what you're talking about: my voice recognition AI doesn't like to run in the same program as the image recognition AI. I've experienced some programs running at different speeds, eg: on a raspi it takes half a second for the image recognition to run, while the servo program can run like 2 dozen times a second, and the voice detection pauses the program until words are heard (or a 5 second timeout). These different speeds/natures of the code require separation, which in turn requires developing a way to communicate with each program. >>33746 >Starting Best way to start is looking for code or a library that does what you want (like image recognition), and try tweaking it to fit your needs, like making it interact with other programs, eg if an object is recognized in an image, move a servo.
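The separation-plus-messaging approach described above — slow vision loop, fast servo loop, talking through a queue so neither blocks the other — might be sketched like this. All the rates and names are made up for illustration; a real setup would likely use separate processes or machines rather than threads.

```python
import queue
import threading
import time

# A slow "vision" loop and a fast "servo" loop run in separate threads
# and exchange messages through a queue, so the fast loop never waits
# on the slow one.
events = queue.Queue()
stop = threading.Event()
log = []   # what the servo loop has acted on

def vision_loop():
    while not stop.is_set():
        time.sleep(0.05)            # pretend recognition takes ~50 ms
        events.put("object_seen")   # publish a detection

def servo_loop():
    while not stop.is_set():
        try:
            log.append(events.get_nowait())  # never block the fast loop
        except queue.Empty:
            pass                    # nothing new; carry on servoing
        time.sleep(0.005)           # servo tick every 5 ms

threads = [threading.Thread(target=vision_loop),
           threading.Thread(target=servo_loop)]
for t in threads:
    t.start()
time.sleep(0.3)                     # let them run briefly
stop.set()
for t in threads:
    t.join()
```

The queue is the "way to communicate" — the servo side polls with `get_nowait()` so a slow or stalled vision program can never freeze the motion loop.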
>>33767 >I've experienced some programs running at different speeds Asynchrony is a deep topic in systems engineering for complex 'systems-of-systems' -- which full-blown robowaifus will certainly be in the end. The buffering and test, test, test combo has been the most successful engineering approach to this issue thus far, AFAICT. Just imagine the timing difficulties that had to be surmounted by the men who created and operated the Apollo spacecraft out to the Moon & back! Lol, our problems here are actually much more complicated (by at least a couple orders of magnitude)!! :DD Kiwi discussed the desire that >A thread dedicated to man machine relationships may be needed : ( >>33634 ), and I agree. While my guess is that he meant for that thread to primarily be focused on the psychological/social aspects of those relationships, I would argue that the engineering of complex parts of our robowaifu's systems that in any way involve responsiveness or interaction-timing with her Master (or others) is definitely fair game for such a thread. The reason is simple: timing of interactions -- particularly verbal ones -- clearly affects the social perceptions involved (for all parties). >tl;dr < If it takes our robowaifus more than half a second to begin to respond to her Master's engagements, then we've waited too long... Cheers. :^) >=== -fmt, prose edit
Edited last time by Chobitsu on 09/28/2024 (Sat) 03:08:07.
>>33905 Thanks, Anon. Nice degree programs. And at least one version of many of these lectures are available online! Cheers. :^)
>>33767 Thanks for the advice. It's welcome.

Prototypes and Failures #4 Noido Dev ##pTGTWW 01/23/2024 (Tue) 03:17:26 No.28715 [Reply] [Last]
Post your prototypes and failures. We fail until we win. Don't forget to look through the old threads here >>418, >>18800 and >>21647 to understand how we got to where we are now.
49 posts and 47 images omitted.
>>33727 Yes. Print a series of interlocking tabs/flats (for tabs) on the backs of each panel. Print a base ring around her waist (for support) that the first row of interlocked panels go down into. Then progressively build up the next row of interlocked panels, etc. Make sense Kiwi?
>>33728 Could you provide some visual to go with your idea? What I imagine based on your response would still have the issue of the TPU parts stretching out of the connectors.
>>33730 >What I imagine based on your response would still have the issue of the TPU parts stretching out of the connectors. Yes, that would be an issue. I'd suggest experimenting with some non-panel test strips of tabs/flats, with a running series of matched sets that get progressively longer. Then you can just fiddle with them by hand to get a feel of what length will stay secure. Also, you might decide that each flat could use a leading 'guide slot' (kind of like a belt going through its attached loop to stay snug) the tab could slide down into to keep it nicely secured into that matching flat. Also, you might decide to have a series of rings running up the (torso, say) to provide supports for the rows of panels both above and below. Kind of like Victorian hoop dresses were. As for the basic concept, there are lots of storage bins that have a series of interlocking tabs/flats on the two lid halves. When you close the two halves of the lid, they interlock together (one tab to one flat) and it provides passive rigidity. Also I can't post images while Robi has file posting blocked for Torfags (this is also why I can't bake new breads rn). Sorry. :P >=== -add 'guide slot' cmnt
Edited last time by Chobitsu on 09/22/2024 (Sun) 07:26:20.
Open file (95.99 KB 511x599 skullT01v01e_002.png)
Open file (46.40 KB 511x599 skullT01v01e_001.png)
Open file (104.62 KB 511x599 skullT01v01e_003.png)
>>32794 Managed to do a bit more work on the skull design. The jaw is shown with cylinders for holding teeth, but this could be switched out with simple teeth in the design, or holes for holding other ones. It can relatively easily be changed. Though, changing the number or anything is still a bit more difficult than I'd like. Whatever, that's the case for many things. I always hope I can make something that I can change by some variables, but it turns out to be more complex with interdependencies.
>>34096 Keep up the great work Anon! Cheers. :^)

Bipedal Robot Locomotion General Robowaifu Technician 09/15/2019 (Sun) 05:57:42 No.237 [Reply] [Last]
We need to talk about bipedal locomotion. It's a complicated topic but one that has to be solved if we are ever to have satisfyingly believable robowaifus. There has surely already been a lot of research done on this topic, and we need to start digging and find the info that's out there. There are some projects that have at least partial robolegs solutions working, but none that I know of that look very realistic yet. We likely won't come up with some master-stroke of genius and solve everyone's problems here on /robowaifu/, but we should at least take a whack at it. Who knows? We certainly can't accomplish anything if we don't try.

I personally believe we should be keeping the weight out of the extremities – including the legs – while other anons think that we should add weight to the feet for balance. What are your ideas, anon? How do we control the gait? How do we adjust for different conditions? What if our robowaifu is carrying things? What about the legs during sex? Should we focus on the maths behind MIP (Mobile Inverted Pendulum), or is there a different approach that would be more straightforward? A mixture? Maybe we can even do weird stuff like reverse-knee legs that so many animals have. Robofaun waifu anyone? What about having something like heelys or bigger wheels in the feet as well?

I'm pretty sure if we just put our heads together and don't stop trying, we'll eventually arrive at at least one good general solution to the problem of creating bipedal robot legs.

>tl;dr
ITT post good robowaifu legs

>tech diagrams sauce
www.youtube.com/watch?v=pgaEE27nsQw
www.goatstream.com/research/papers/SA2013/SA2013.pdf
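As a taste of the MIP-style maths the OP mentions, here's a toy inverted-pendulum balance loop with a hand-tuned PD controller. All parameters are hypothetical, and this ignores the wheel/cart dynamics a real MIP adds — it's only meant to show the basic shape of the control problem.

```python
import math

# Pendulum that wants to fall over, PD controller that fights gravity.
g, L, dt = 9.81, 1.0, 0.001   # gravity (m/s^2), length (m), timestep (s)
theta, omega = 0.1, 0.0       # start tilted 0.1 rad, at rest
kp, kd = 40.0, 10.0           # hand-tuned PD gains (per unit inertia)

for _ in range(5000):         # simulate 5 seconds
    u = -kp * theta - kd * omega            # corrective torque
    alpha = (g / L) * math.sin(theta) + u   # angular acceleration
    omega += alpha * dt                     # Euler integration
    theta += omega * dt
# theta should settle near zero (balanced upright)
```

The proportional term pushes back against tilt, and the derivative term damps the swing so she doesn't oscillate forever — the same two ingredients every self-balancing robot starts from before the real-world complications pile on.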
185 posts and 73 images omitted.
>>33969 >muh le 2000lb motor >Lolwut? A 2000lb winch motor, each motor pulls 15A for a total of 180 watts.
>>33972 Oh, OK got it. Heh, I thought you were saying the motor weighed a ton. :^)
Thanks for all the YouTube links, but please add some description when posting. Ideally more than one line. These videos have a text description on YouTube which can be copied. With ">" you can quote it, though make sure to put this in front of each paragraph when doing so. Cheers.
