/robowaifu/ - DIY Robot Wives

Advancing robotics to a point where anime catgrill meidos in tiny miniskirts are a reality.



“If you are going through hell, keep going.” -t. Winston Churchill


Open file (93.53 KB 800x540 TypesOfMotors.jpg)
Open file (436.87 KB 1664x2048 MotionBaseServos.jpeg)
Open file (78.13 KB 922x1396 Femisapien.jpg)
Open file (2.25 MB 2500x1778 MechaMadoka.jpg)
Actuators For Waifu Movement Part 3 Kiwi 12/06/2023 (Wed) 01:18:16 No.27021 [Reply] [Last]
(1st thread >>406, 2nd thread >>12810) Kiwi back again with a thread for discussing actuators to move your waifu! Part Three! Let's start with a quick introduction to common actuators!
1. DC motors: these use brushes to switch the ferrous-core electromagnets on a rotor, rotating its magnetic field relative to the surrounding magnets. They're one of the cheapest options, with a typical efficiency range of 30 to 90%. Larger DC motors and motors with higher turn counts are more efficient.
1.5 Coreless DC motors: by removing ferrous materials, losses from hysteresis are almost eliminated, dramatically increasing efficiency to nearly 90% even in small motors. Eliminating the ferrous materials also reduces flux focusing, resulting in weaker fields and higher speeds.
2. Brushless DC motors (BLDC): these use a controller to switch the electromagnets on a stator to rotate the magnets of a rotor! Without brushes, they have the potential to be more efficient, with higher power density than DC motors. Their efficiency and behavior vary depending on the algorithm and sensors used to control them. Coreless brushless motors exist but are rare and only used for very niche applications.
3. AC motors: a wide and incredibly varied category. They all rely on AC's frequency for control. Single-phase AC motors rely on shaded poles, capacitors, or some other method to induce a rotating magnetic field, while 3-phase AC motors naturally have a rotating field, which usually gives them higher efficiency and power density. Notably, most AC motors are brushless; the most commonly used brushed AC motor is the universal motor, which can run on either AC or DC.
4. Stepper motors: brushless motors with ferrous teeth to focus magnetic flux. This allows for incredible control (stepping) at the cost of greater mass, which gives them higher rotary inertia. Usually 50 to 80% efficient depending on the control algorithm, speed, and quality of the stepper. Due to their increasing mass production (& ubiquitous low-cost controllers), they have appeal as a lower-cost alternative to BLDC motors if one carefully designs around them.
5. Coiled Nylon Actuators! These things have an efficiency rating so low it's best to just say they aren't efficient (0.01% typical, 2% achieved under extremely specific conditions in a lab). Though they are exciting due to their incredibly low cost of fabrication, they're far too slow and the energy requirements are nonsensical. https://youtu.be/S4-3_DnKE9E https://youtu.be/wltLEzQnznM
6. Hydraulics! These rely on the distribution of pressure in a working liquid to move things like pistons. Though popular in large-scale industry, their ability to be used in waifus has yet to be proven. (Boston Dynamics' Atlas runs on hydraulics, but it's a power guzzler and heavy.) Efficiency varies wildly depending on implementation. They would work great for a giantess!
7. Pneumatics, hydraulics' lighter sister! This time the fluid is air, which gives them the advantage in weight. They aren't capable of the same power loads hydraulics are, but who wants their waifu to bench press a car? (Too loud and inefficient for mobile robotics.)
8. Wax motors: hydraulic systems where the working fluid is expanding melted wax (commonly paraffin)! Cheap, low power, and they produce incredible forces! Too bad they're slow and hard to control.
9. Explosion! Yes, you can move things through explosions! Gas engines work through explosions! Artificial muscles can be made by exploding a hydrogen and oxygen mixture in a piston, then using electrolysis to turn the water back into hydrogen and oxygen. None of this is efficient or practical, but it's vital we keep our minds open!
Though there are more actuators, most are derivatives of these or use these examples to work. Things like pulleys need an actuator to move them. Now, let's share, learn, and get our waifu moving!
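To put some rough numbers on the options above, here is a minimal back-of-the-envelope sketch (plain Python; every value in it is a made-up example, not a recommendation) for estimating what a geared rotary actuator will draw from the battery given a target joint torque and speed:

import math

def required_electrical_power(joint_torque_nm, joint_speed_rpm,
                              gear_ratio, gear_efficiency, motor_efficiency):
    # Mechanical power at the joint: P = tau * omega.
    joint_speed_rad_s = joint_speed_rpm * 2 * math.pi / 60
    mech_power_w = joint_torque_nm * joint_speed_rad_s
    # Work backwards through gearbox and motor losses.
    motor_shaft_power_w = mech_power_w / gear_efficiency
    electrical_power_w = motor_shaft_power_w / motor_efficiency
    # What the motor itself has to deliver.
    motor_torque_nm = joint_torque_nm / (gear_ratio * gear_efficiency)
    motor_speed_rpm = joint_speed_rpm * gear_ratio
    return electrical_power_w, motor_torque_nm, motor_speed_rpm

# Example: an elbow holding 5 N*m at 30 RPM through a 20:1 reduction,
# assuming 85% gearbox and 70% motor efficiency (both guesses).
p, t, s = required_electrical_power(5.0, 30.0, 20, 0.85, 0.70)
print(f"~{p:.0f} W electrical; motor sees {t:.2f} N*m at {s:.0f} RPM")

Swapping in the efficiency figures from the list above (coreless DC vs. stepper vs. BLDC) shows quickly how much the choice of actuator affects battery life.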


Edited last time by Chobitsu on 12/06/2023 (Wed) 03:06:55.
183 posts and 54 images omitted.
>>34152
>Thanks for working on this.
>I hope you make this into modules which can be easily adapted to something. I could also imagine that I'd like to run the code for the actual movement decentralized on Arduinos.
Y/w. I hope to expose every user-directable function as a pythonic API, suitable for scripting from, say, CyberPonk's Horsona library. Even Bash scripting isn't off the table ATP.
>The realization and the command to do a set of movements, the motion planning, and the execution of the commands to the servo might be on different hardware (SBCs and controllers).
Yes, very much so. Anon's IPCnet is the backbone to allow such distributed, decentralized computing to occur within our robowaifus.
>>34167
>I might draw and scan something tomorrow
Hope to see it soon, NoidoDev! Cheers. :^)
>>34171
POTD
This is growing into something important to us here, IMO. Keep it up, Anon!
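Purely as an illustration of what "exposing user-directable functions as a pythonic API" might look like, here is a hypothetical sketch. None of these names exist in RW Foundations, Horsona, or anywhere else; they are invented strawmen:

from dataclasses import dataclass

@dataclass
class Pose:
    joint: str
    angle_deg: float
    duration_s: float = 1.0

class WaifuMotion:
    # Hypothetical thin client that forwards motion commands to whatever
    # is actually driving the servos (SBC, Arduino bridge, or a simulator).
    def __init__(self, transport):
        self.transport = transport  # e.g. a serial-port or socket wrapper

    def move(self, pose: Pose):
        # A simple line protocol a microcontroller could parse.
        self.transport.send(f"MOVE {pose.joint} {pose.angle_deg} {pose.duration_s}\n")

    def wave_hello(self):
        for angle in (40, 80, 40, 80, 20):
            self.move(Pose("right_elbow", angle, 0.4))

The point is only that every scriptable action bottoms out in small, composable calls that an external script (or a Bash wrapper around a CLI) can drive, which is what makes the distributed SBC/microcontroller split workable.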
Hear my schizo post, Anons. I think I have a good idea. We should use this train of connections to form an electric muscle:
<Bone tether => Actuator Housing => Generic electric motor => magnetic sun-gear reduction => spool => tensile cable => bone tether.
Don't forget a reset spring that returns the muscle to its relaxed position. Turning on the motor puts torque through a double-acting gear reducer and clutch to wind up a spool, to pull a fiber, to pull the muscle tethers.
Why this train? First off, you'll need to see this: https://www.youtube.com/watch?v=eaMD_9kOlTA&t=154s
This video is the work of a genius on a magnetic gear train. Why do you care? Because...
>We can swap out one of the magnetic gearwheels with *electromagnets*.
>Electromagnets will vary strength based on the current running through them, and this allows us to essentially have a *frictionless clutch* between the electric motor and the muscle fiber. A frictionless clutch!
>The sun-gear set is concentric! We can fit them all into a tubular housing with a skinny aspect ratio!
This takes a ton of complexity out of controlling the actuator. It will act like a muscle. You turn on the motor and activate the clutch, and the muscle pulls. Turn up the power of the motor/clutch, and it pulls harder. When you're done, it relaxes and resets to the spring settings. If the muscle is strained harder than it pulls, the clutch simply slips! It's cuddle-safe!
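A toy model of that motor-plus-electromagnetic-clutch "muscle" might look like the following. The linear current-to-torque law and all the numbers are assumptions for illustration only; a real magnetic coupling would need measured data:

def clutch_capacity_nm(coil_current_a, k_nm_per_a=0.05):
    # Assume transmissible torque scales roughly linearly with coil current.
    return k_nm_per_a * coil_current_a

def muscle_pull(motor_torque_nm, coil_current_a,
                gear_ratio=30.0, spool_radius_m=0.01, external_load_n=0.0):
    # Cable tension the actuator delivers; if the load needs more torque
    # than the clutch can pass, the clutch slips (the 'cuddle-safe' case).
    capacity = clutch_capacity_nm(coil_current_a)
    transmitted = min(motor_torque_nm, capacity)   # clutch caps the torque
    pull_force_n = transmitted * gear_ratio / spool_radius_m
    load_at_clutch = external_load_n * spool_radius_m / gear_ratio
    slipping = load_at_clutch > capacity
    return pull_force_n, slipping

force, slipped = muscle_pull(0.02, 0.5, external_load_n=10.0)
print(f"{force:.0f} N of pull, clutch slipping: {slipped}")

Varying coil_current_a is what plays the role of "turning up" the muscle here; the reset spring isn't modeled.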


>>34161
I meant that passive dynamic research has provided different mechanisms, algorithms, and equations which would benefit your designs in general. What works for walking works for anything else that relies on dynamics. We all need to keep an open mind for using previous research in novel ways. For instance, you're working with brakes, and passive dynamic research involves using brakes to optimize energy use with gravity.
>>34319 POTD This is great stuff, Anon. I'd suggest you also have a look at the linear actuators for the Juggler Bot Anon's designs. They are fairly closely related to your approach as well, I think. I hope & pray you'll actually get the resources together to make these actuators real, and then share your experiences here with everyone. Please keep the great ideas coming, Anon! Cheers. :^)
> (topics-related: >>34509, >>34550 )

Waifus in society Robowaifu Technician 09/11/2019 (Wed) 02:02:53 No.106 [Reply] [Last]
Would you walk around with your waifu? Would you hold her in public? Would you shamelessly bring her along to conventions? Would you take her on dates? This thread is for discussing how you'd interact with your waifu outside of the home.
135 posts and 56 images omitted.
>>34304
>First Post Best Post
That is not the first post.
>>34310
That's why I'm a little conflicted. I like the idea of my waifu being able to defend herself, but her being too strong could also cause potential problems.
>>34310
While it is a possibility, the clownhaired rabid feminist types are rarer than the internet would lead you to believe (part of their strength is the illusion of it). If your robowaifu is helpful beyond companionship (e.g. can help you carry stuff), that would make it more socially acceptable for normies... but even normies won't mind if it is a robot with some neat, fun features. When I've seen public robots, folks tend to leave them alone when they're accompanied by humans and respect the bot. Just try to avoid mentioning whether or not it is fully functional and anatomically correct ;)
e.g. normie: "Does your robowaifu have genitals?" robosexual: "Do you?"
>>34310
It's not the women I'm concerned about, but the feral simp hordes they'll get to do their dirty bidding.
>>34330 Well in that case, just defend your property. There's no stigma against hitting another man.
>>34322 >That is not the first post. Actually, it was at the time that Anon made that post. Between that time and when you read it, I merged that whole thread into this one (which then became the last few posts ITT). This is a mundane example of so-called temporal sliding. >tl;dr You're both right! :^)

Open file (349.32 KB 480x640 0c.png)
Robowaifu Media Propaganda and Merchandizing Anon 01/29/2023 (Sun) 22:15:50 No.19295 [Reply] [Last]
That Time I Incarnated My Christian Cat Girl Maid Wife in a Short Dress
Together we can create the robowaifu-verse! Let's start making media to further our cause. There's tremendous potential for us to reach a wide market and prime minds to accept their future cat grill meidos in tiny miniskirts. We have text-based stories floating around, but audio and visuals are paramount to gain a general audience. I will start with a short about a young man building a small cat girl and learning to understand her love. Importantly, she will be very limited, in the same way our first models will be, to set certain expectations. Mahoro is impossible, but the slow and simple robowaifus that can exist have their own charm. Let's show the world the love that will soon exist.
---
>thread-related
Robowaifu Propaganda and Recruitment (>>2705)
>===
-add related crosslink
Edited last time by Chobitsu on 02/02/2023 (Thu) 22:18:06.
74 posts and 44 images omitted.
>>33390
I think these lyrics aren't appropriate just yet: too negative, too stressful, too scary. We should stick to positive and calming material.
>>33534 Honestly, thx for saying so. Soon after posting, I realised that last part probably wasn't very /robowaifu/ and the singer's attitude changes halfway through, fu-. I must have been more emotionally influenced than I thought - I'll keep that in mind. Just hope the first half wasn't too bad.
>>33534
This. The yuge complexity we're all facing is quite stressful enough all on its own! :^)
Open file (1.91 MB 918x1200 ChiiByIchiTATa.jpg)
How can an AI be trusted? How could we prove that our AI is trustworthy? How would we demonstrate that people should put their trust in the works of our members? What methods would we use to convey this to a wide audience? I still plan to post videos of myself with a prototype MaidCom once she is presentable. We need more; we need widespread belief that our way is safe, trustworthy, and worthy of public use. We also need to prove that Claude, Gemini, and other competing AI are less worthy. We need to open a path where FOSS AI is the mainstream. We need to do so fast, and strike before they can regulate us into obscurity. https://www.youtube.com/watch?v=KUkHhVYv3jU
>>34287
Wow! That video was really cool, Kiwi. Thanks for sharing it! :^)
<--->
As to your question. Again, wow! That's a really, really tough proposition, Anon. BTW, you sure this wouldn't be better in the Safety/Security, or Cognitive, or Philosophy, or Ethics/Morals thread(s)? I trust no man (or woman obvs, lol). Not even myself. God alone do I really trust with everything (or so I claim lol; how can one really know until 'everything' is put to the actual test? :^) But that's a matter of my own faith in God's deeds and words (which are innumerable!) So I think the >tl;dr here rightly is: We can't. Probably not the answer you wanted, but it's the one you need -- and it saves me literal years of fruitless searching (just like the fabled King, lol!)
<--->
OTOH, we have no excuses not to make our works as trustworthy and reliable as we can. OTOOH, this type of goal costs: time, money, effort, intelligence. These things tend to be in short supply in many Anons' lives. But all can be either gained or supplied -- except time.



SPUD (Specially Programmed UwU Droid) Mechnomancer 11/10/2023 (Fri) 23:18:11 No.26306 [Reply] [Last]
Henlo anons, Stumbled here via a youtube rabbit hole & thought I'd share my little side project. Started out as just an elaborate way to do some mech R&D (making a system to generate animation files in Blender on Windows and export/transfer them to a Raspberry Pi system) and found tinkering with the various python libraries a kinda neat way to pass the time when weather doesn't permit my outside mech work. Plus I'd end up with a booth babe that I don't have to pay or worry about running off with a convention attendee. Currently running voice commands via google speech and chatgpt integration, but I'm looking into offline/local stuff like openchat. WEF and such are so desperate to make a totalitarian cyberpunk dystopia I might as well make the fun bits to go along with it. And yes. Chicks do dig giant robots.
497 posts and 277 images omitted.
>>34226 Hmmm James might be onto something here. I'll be back in a while lol. https://www.youtube.com/watch?v=AEXz8xyTC54
>>34226
Those speaker boobs are comically disturbing due to how they recess. Do they have some sort of cover that goes over them? It looks like you made some sort of attachment ring for mesh or foam, I presume; it would help prevent speaker damage.
>>34254 This is a good example of why I keep pushing the use of LARPfoam for our robowaifu's 'undershells'. LARPagans have a big set of communities around this stuff today, and it's a good idea for us here to benefit from all this information. BTW, another Anon also posted this video here, but I can't locate that post ATM. >I'll be back in a while lol. Lol, don't stay away too long, Mechnomancer. ABTW, this is a daily reminder you'll need a new bread when you get back. Alway rember to link the previous throd in your OP. Cheers, Anon. :^) >>34259 Great find Kiwi, thanks very kindly. I wonder what Bruton has in store for his 'next design that walks better'? Cheers. :^)
New bread: >>34445

Open file (32.62 KB 341x512 unnamed.jpg)
Cyborg general + Biological synthetic brains for robowaifus? Robowaifu Technician 04/06/2020 (Mon) 20:16:19 No.2184 [Reply] [Last]
Scientists made a neural network from rat neurons that could fly a fighter jet in a simulator and control a small robot. I think that lab grown biological components would be a great way to go for some robowaifu systems. It could also make it feel more real. https://www.google.com/amp/s/singularityhub.com/2010/10/06/videos-of-robot-controlled-by-rat-brain-amazing-technology-still-moving-forward/amp/ >=== -add/rm notice
Edited last time by Chobitsu on 08/23/2023 (Wed) 04:40:41.
195 posts and 33 images omitted.
https://physicsworld.com/a/genetically-engineered-bacteria-solve-computational-problems/ >Now a research team from the Saha Institute of Nuclear Physics in India has used genetically modified bacteria to create a cell-based biocomputer with problem-solving capabilities. The researchers created 14 engineered bacterial cells, each of which functioned as a modular and configurable system. They demonstrated that by mixing and matching appropriate modules, the resulting multicellular system could solve nine yes/no computational decision problems and one optimization problem.
>>34134 Very smart, Anon. >Ive been working in factories my whole life so probably industrial/controls engineering Makes sense. Welp, master PID [1], Anon. Then you should look into Behavioral Trees afterwards, to make everything accessible for use by us mere mortals. Cheers, Anon. :^) Keep.Moving.Forward. --- 1. https://en.wikipedia.org/wiki/Proportional%E2%80%93integral%E2%80%93derivative_controller >>34170 Very intredasting. Thanks, Anon! Cheers. :^)
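For anyone following along who hasn't met PID yet: the whole discrete controller is only a few lines. The gains below are arbitrary placeholders, and the one-line "plant" exists only so the example runs; real tuning depends on the actual joint:

class PID:
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, setpoint, measured):
        error = setpoint - measured
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# Example: drive a servo toward 90 degrees with a crude stand-in plant model.
pid = PID(kp=2.0, ki=0.5, kd=0.1, dt=0.02)   # 50 Hz loop, made-up gains
position = 10.0
for _ in range(50):
    command = pid.update(setpoint=90.0, measured=position)
    position += 0.02 * command               # toy plant: position follows command
print(position)                              # should be closing in on 90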
>>34219
Here's something you should keep an eye on. A human connectome would allow a computer to simulate an entire human brain if it bears fruit. They've already done it with a worm and put the simulated brain in robots. A human version would also be theoretically possible.
https://www.humanconnectome.org/
>>34222 >digits There's about a Zeta of synaptic interconnections within the human connectome. AFAICT, we have the most sophisticated -- by far -- neural systems on the planet. So it probably stands to reason that there's much that could be learned by using human neural tissue for such experiments. Thanks for the information, Anon! Cheers. :^)
> keratin based materials as a skin >>34724

The Sumomo Project Chobitsu Board owner 11/24/2021 (Wed) 17:27:18 No.14409 [Reply] [Last]
So I've been working for a while at devising an integrated approach to help manage some of the software complexity we are surely going to encounter when creating working robowaifus. I went down many different bunny trails and (often) fruitless paths of exploration. In the end I've finally hit on a relatively simplistic approach that AFAICT will actually allow us to both have the great flexibility we'll be needing, and without adding undue overhead and complexity.

I call this the RW Foundations library, and I believe it's going to help us all out a lot with creating workable & efficient software that (very hopefully) will allow us to do many things for our robowaifus using only low-end, commodity hardware like the various single-board computers (SBCs) and microcontrollers. Devices like the Beaglebone Blue and Arduino Nano for example. Of course, we likely will also need more powerful machines for some tasks as well. But again, hopefully, the RW Foundations approach will integrate smoothly with that need as well and allow our robowaifus to smoothly interoperate with external computing and other resources. I suppose time will tell.

So, to commemorate /robowaifu/'s 5th birthday this weekend, I've prepared a little demonstration project called Sumomo. The near-term goal for the project is simply to create a cute little animated avatar system that allows the characters Sumomo and Kotoko (from the Chobits anime series) to run around having fun and interacting with Anon. But this is also a serious effort, and the intent is to begin fleshing out the real-world robotics needs during the development of this project. Think of it kind of like a kickstarter for real-world robowaifus in the end, but one that's a very gradual effort toward that goal and a little fun along the way.

I'll use this thread as a devblog and perhaps also a bit of a debate and training forum for the many issues we all encounter, and how a cute little fairybot/moebot pair can help us all solve a few of them. Anyway, happy birthday /robowaifu/ I love you guys! Here is my little birthday present to you.
===
>rw_sumomo-v211124.tar.xz.sha256sum
8fceec2958ee75d3c7a33742af134670d0a7349e5da4d83487eb34a2c9f1d4ac *rw_sumomo-v211124.tar.xz
>backup drop


Edited last time by Chobitsu on 10/22/2022 (Sat) 06:24:09.
159 posts and 97 images omitted.
>>33843 >Blender does a lot of relevant things to support high performance, hard and soft realtime requirements, and heterogeneous development. Not sure what you mean about realtime in Blender's case, but otherwise fair enough. It's a remarkable system today! :^) >Blender's design docs I've seen these in the past, but since I stopped actively building Blender 2-3 years ago, I kind of let it slip my mind. So thanks for the reminder. I personally like Blender's documentation efforts, though I've heard some disagree. Not-uncommonly, this is one of those tasks that get pushed to the 'back burner', and is often left to volunteer work to accomplish. Given the breadth & scope of the platform, I'd say the Blender Foundation has done a yeoman's job at the doco work, overall. Very passable. <---> Also, reading that link reminded me of USD. NVIDIA is currently offering developers their version of free training on this topic, and I've been pondering if I can make the time to attend. A huge amount of the DCC industry has come together to cooperate on Pixar's little baby, and today it's a big, sprawling system. Why it's of interest to us here is that most of what a robowaifu will need to do to analyze and construct models of her 'world' is already accounted for inside this system. While there are plenty of other (often higher-speed) ways to accomplish the same (or nearly the same) tasks, the fact that USD has become such a juggernaut, with a highly-regimented approach to scene descriptions, and with such broad approval, improves the likelihood IMO that other Anons from the film & related industries may in fact be able to help us here once they discover robowaifus in the future -- if we're already using USD to describe her world and the things within it. I hope all that made sense, Anon. https://openusd.org/release/glossary.html# >===


Edited last time by Chobitsu on 10/02/2024 (Wed) 17:12:35.
>>33845
>Not sure what you mean about realtime in Blender's case
This page looks relevant: https://developer.blender.org/docs/features/cycles/render_scheduling/
Blender does progressive rendering, which starts by rendering low-resolution frames. If there's extra time left over before a frame needs to be rendered, it generates more samples to produce a higher-resolution frame. The equivalent for video generation at a fixed framerate would be running a small number of denoising steps for the next frame, and running additional denoising steps if the next frame doesn't need to be rendered yet. For text generation at a fixed token rate, it would be equivalent to doing speculative decoding for the initial response, then using (maybe progressively) larger models if the next token doesn't need to be output yet. For a cognitive architecture with a fixed response rate, I think the equivalent would be generating an initial response, then continually refining the response based on self-evaluations & feedback from other modules until the response needs to be output.
>USD
Very nice. I hadn't heard of this. It looks like a goldmine of information. Your explanation does make sense, and it's a great example of the sort of design patterns that I expect would be useful, in this case for modeling the environment & context.
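A minimal skeleton of that "refine until the deadline" pattern (hypothetical functions, just to show the control flow) could look like this:

import time

def anytime_response(generate_draft, refine, deadline_s):
    # generate_draft: fast, low-quality first pass (small/speculative model).
    # refine: takes the current best response, returns a better one or None.
    # deadline_s: wall-clock budget for this output slot.
    start = time.monotonic()
    best = generate_draft()
    while time.monotonic() - start < deadline_s:
        candidate = refine(best)
        if candidate is None:        # refiner has nothing better to offer
            break
        best = candidate
    return best

# Toy usage: keep appending detail until ~100 ms have elapsed or we give up.
reply = anytime_response(lambda: "ok.",
                         lambda cur: None if len(cur) > 60 else cur + " (more detail)",
                         deadline_s=0.1)
print(reply)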
>>33850 OK good point, CyberPonk. Such UX optimizations can fairly be said to be in the domain of soft-realtime. And certainly, integrating GPU processing code into the system to speed the rendering processes of Cycles & EEVEE has had major positive impacts. I personally think the fact that Ton chose to create the entire GUI for Blender in OpenGL all those years ago has had many far-reaching effects, not the least of which is general responsiveness of the system overall (especially as it has rapidly grown in complexity over the last few years). <---> >It looks like a goldmine of information Glad you like it! It's fairly easy to overlook that describing a scene is in fact a very-complex, nuanced, and -- I'm going to say it -- human undertaking. And when you consider that task from the deeply-technical aspect that USD (and we here) need to accommodate, then you wind up with quite a myriad of seeming-odd-juxtapositions. Until D*sney got their claws into it, Pixar was a one-of-a-kind studio, and well up to such a complicated engineering effort. I doubt they could do it as well today. If at all. DEI DIE to the rescue!111!!ONE! :D Cheers, Anon. :^) >=== -fmt, minor, funpost edit
Edited last time by Chobitsu on 10/03/2024 (Thu) 03:49:23.
>>33857 I looked up that USD. "USD stands for “Universal Scene Description”". I hadn't heard of it. Wow, that's some super comprehensive library and format. Hats off to pixar for open sourcing this.
>>34201 >Hats off to pixar for open sourcing this. Well, it's a vested-interest, but yeah; you're absolutely correct Grommet. Sadly, I'm sure they couldn't even pull it off today; they've become quite afflicted with the incompetency crisis. >protip: competency doesn't cause a crisis, only incompetency does. :^)

Open file (329.39 KB 850x1148 Karasuba.jpg)
Open file (423.49 KB 796x706 YT_AI_news_01.png)
Open file (376.75 KB 804x702 YT_AI_news_02.png)
General Robotics/A.I./Software News, Commentary, + /pol/ Funposting Zone #4 NoidoDev ##eCt7e4 07/19/2023 (Wed) 23:21:28 No.24081 [Reply] [Last]
Anything in general related to the Robotics or A.I. industries, and any social or economic issues surrounding it (especially of robowaifus).
-previous threads:
> #1 (>>404)
> #2 (>>16732)
> #3 (>>21140)
524 posts and 191 images omitted.
> this thread
<insert: TOP KEK>
>"There is a tide in the affairs of men, Which taken at the flood, leads on to fortune. Omitted, all the voyage of their life is bound in shallows and in miseries. On such a full sea are we now afloat. And we must take the current when it serves, or lose our ventures."
>t. A White man, and no jew...
>>34164
DAILY REMINDER
We still need a throd #5 here. Would some kindly soul (maybe NoidoDev, Greentext anon, or Kiwi) please step up and make one for us all? TIA, Cheers. :^)
Open file (2.43 MB 2285x2962 2541723.png)
>>34230
Guess it's up to me again. This was much easier than the meta thread. Took me like fifteen minutes, and ten of those were spent browsing in my image folders for the first two pics. Changes are as follows:
+ New cover pic
+ Added poner pic
+ New articles
~ Minor alteration to formatting
>>34233
>>34234 >Guess it's up to me again. Thanks, Greentext anon! Cheers. :^)
>>34234 NEW THREAD NEW THREAD NEW THREAD >>34233 >>34233 >>34233 >>34233 >>34233 NEW THREAD NEW THREAD NEW THREAD

Open file (14.96 KB 280x280 wfu.jpg)
Beginners guide to AI, ML, DL. Beginner Anon 11/10/2020 (Tue) 07:12:47 No.6560 [Reply] [Last]
I already know we have a thread dedicated to books, videos, tutorials etc. But there are a lot of resources there, and as a beginner it is pretty confusing to find the correct route to learn ML/DL advanced enough to be able to contribute to the robowaifu project. That is why I thought we would need a thread like this. Assuming that I only have basic programming in python, dedication, love for robowaifus, but no maths, no statistics, no physics, no college education, how can I get advanced enough to create AI waifus? I need a complete pathway directing me to my aim. I've seen that some of you guys recommended books about reinforcement learning and some general books, but can I really learn enough by just reading them? AI is a huge field, so it's pretty easy to get lost. What I did so far was to buy a great non-english book about AI: philosophical discussions of it, general algorithms, problem solving techniques, its history, limitations, game theories... But it's not a technical book. Because of that I also bought a few courses on this website called Udemy. They are about either Machine Learning or Deep Learning. I am hoping to learn basic algorithms through those books, but because I don't have maths it is sometimes hard to understand the concepts. For example, even when learning linear regression, it is easy to use a python library, but I can't understand how it exactly works because of the lack of Calculus I have. Because of that issue I have a hard time understanding algorithms.
>>5818 >>6550
Can those anons please help me? Which resources should I use in order to be able to produce robowaifus? If possible, you can even create a list of books/courses I need to follow one by one to be able to achieve that aim of mine. If not, I can send you the resources I got and you can help me to put those in an order. I also need some guidance about maths, as you can tell. Yesterday, after deciding and promising myself that I will give whatever it takes to build robowaifus, I bought 3 courses about linear algebra, calculus, and stats, but I'm not really good at them. I am waiting for your answers, anons. Thanks a lot!
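Since linear regression comes up specifically: the calculus involved is a single derivative, and seeing it written out as code sometimes helps more than calling a library. A from-scratch gradient-descent sketch (NumPy only, toy data) looks like this:

import numpy as np

# Fake data: y = 3x + 2 plus noise.
rng = np.random.default_rng(0)
x = rng.uniform(0, 10, 100)
y = 3 * x + 2 + rng.normal(0, 1, 100)

w, b = 0.0, 0.0          # slope and intercept, start anywhere
lr = 0.01                # learning rate

for _ in range(2000):
    y_pred = w * x + b
    error = y_pred - y
    # These two lines are "the calculus": partial derivatives of the
    # mean squared error with respect to w and b.
    grad_w = 2 * np.mean(error * x)
    grad_b = 2 * np.mean(error)
    w -= lr * grad_w
    b -= lr * grad_b

print(w, b)              # should land near 3 and 2

A library like scikit-learn is doing essentially this (or the closed-form equivalent) under the hood; the point is that the "maths" behind the fit is just "nudge the parameters downhill on the error".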
74 posts and 106 images omitted.
>>33765
> But I could see training one to recognize voice, one to deal with movement, one to deal with vision, specifically not running into things, and maybe one for instruction on a low level. Like move here, pick up this, etc
If I remember right, Mark Tilden referred to this as a "horse and rider" setup, where you have a high-level program giving direction to a lower-level program. The lower level worries about not stepping in a hole, etc., while the high level worries about where the pair are going. I too have experienced the boons of separating different functions into different programs/AI. To give a real-life example of what you're talking about: my voice recognition AI doesn't like to run in the same program as the image recognition AI. I've experienced some programs running at different speeds, eg: on a raspi it takes half a second for the image recognition to run, while the servo program can run like 2 dozen times a second while it is running, and the voice detection pauses the program until words are heard (or a 5 second timeout). These different speeds/natures of the code require separation, which in turn requires developing a way to communicate with each program.
>>33746
>Starting
Best way to start is looking for code or a library that does what you want (like image recognition), and try tweaking it to fit your needs, like making it interact with other programs, eg if an object is recognized in an image, move a servo.
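To make that separation concrete, here is a minimal sketch of the pattern with Python's multiprocessing: a slow "vision" process posts results to a queue, and a fast servo loop drains it without ever blocking. The rates and the fake detection payload are placeholders:

import multiprocessing as mp
import time

def vision_worker(out_q):
    # Stand-in for a slow image-recognition loop (~2 Hz on a Pi).
    while True:
        time.sleep(0.5)
        out_q.put({"object": "cup", "x": 0.3})    # pretend detection result

def servo_loop(in_q):
    target = 0.0
    for _ in range(250):                          # ~10 s demo, then exit
        # Drain whatever the slower process has produced; never block.
        while not in_q.empty():
            target = in_q.get()["x"]
        # ... command servos toward `target` here ...
        time.sleep(0.04)                          # ~25 Hz control loop

if __name__ == "__main__":
    q = mp.Queue()
    mp.Process(target=vision_worker, args=(q,), daemon=True).start()
    servo_loop(q)

The same idea scales up to separate processes on separate boards; only the queue gets replaced by a socket or serial link.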
>>33767
>I've experienced some programs running at different speeds
Asynchrony is a deep topic in systems engineering for complex 'systems-of-systems' -- which full-blown robowaifus will certainly be in the end. The buffering and test, test, test combo has been the most successful engineering approach to this issue thus far, AFAICT. Just imagine the timing difficulties that had to be surmounted by the men who created and operated the Apollo spacecraft out to the Moon & back! Lol, our problems here are actually much more complicated (by at least a couple orders of magnitude)!! :DD
Kiwi discussed the desire that
>A thread dedicated to man machine relationships may be needed
( >>33634 ), and I agree. While my guess is that he meant for that thread to primarily be focused on the psychological/social aspects of those relationships, I would argue that the engineering of complex parts of our robowaifu's systems that in any way involve responsiveness or interaction-timing with her Master (or others) is definitely fair game for such a thread. The reason is simple: timing of interactions -- particularly verbal ones -- clearly affects social perceptions (for all parties involved).
>tl;dr
< If it takes our robowaifus more than half a second to begin to respond to her Master's engagements, then we've waited too long... Cheers. :^)
>=== -fmt, prose edit
Edited last time by Chobitsu on 09/28/2024 (Sat) 03:08:07.
>>33905 Thanks, Anon. Nice degree programs. And at least one version of many of these lectures are available online! Cheers. :^)
>>33767 Thanks for the advice. It's welcome.

Bipedal Robot Locomotion General Robowaifu Technician 09/15/2019 (Sun) 05:57:42 No.237 [Reply] [Last]
We need to talk about bipedal locomotion. It's a complicated topic, but one that has to be solved if we are ever to have satisfyingly believable robowaifus. There has surely already been a lot of research done on this topic, and we need to start digging and find the info that's out there. There are some projects that have at least partial roboleg solutions working, but none that I know of that look very realistic yet. We likely won't come up with some master-stroke of genius and solve everyone's problems here on /robowaifu/, but we should at least take a whack at it; who knows? We certainly can't accomplish anything if we don't try.

I personally believe we should be keeping the weight out of the extremities – including the legs – while other anons think that we should add weight to the feet for balance. What are your ideas, anon? How do we control the gait? How do we adjust for different conditions? What if our robowaifu is carrying things? What about the legs during sex? Should we focus on the maths behind MIP (Mobile Inverted Pendulum), or is there a different approach that would be more straightforward? A mixture? (A toy balance sketch follows the sauce links below.) Maybe we can even do weird stuff like the reverse-knee legs that so many animals have. Robofaun waifu, anyone? What about having something like heelys or bigger wheels in the feet as well?

I'm pretty sure if we just put our heads together and don't stop trying, we'll eventually arrive at at least one good general solution to the problem of creating bipedal robot legs.

>tl;dr
ITT post good robowaifu legs

>tech diagrams sauce
www.youtube.com/watch?v=pgaEE27nsQw
www.goatstream.com/research/papers/SA2013/SA2013.pdf
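Regarding the MIP question above, a toy 2D inverted-pendulum balance simulation is small enough to paste here. The mass, gains, and massless-rod model are all illustrative assumptions, not a real leg design:

import math

m, l, g = 5.0, 0.8, 9.81      # mass (kg), CoM height (m), gravity
kp, kd = 120.0, 25.0          # PD gains on lean angle (made up)
dt = 0.001

theta, omega = 0.15, 0.0      # start leaning ~8.6 degrees
for step in range(int(3 / dt)):
    torque = -(kp * theta + kd * omega)          # ankle/hip torque command
    # Dynamics of a point mass on a massless rod about the pivot:
    #   m*l^2 * theta_ddot = m*g*l*sin(theta) + torque
    alpha = (m * g * l * math.sin(theta) + torque) / (m * l * l)
    omega += alpha * dt
    theta += omega * dt
print(f"lean after 3 s: {math.degrees(theta):.3f} deg")   # ~0 if the gains stabilize it

If kp is dropped below m*g*l (about 39 N*m/rad with these numbers), the "gravity spring" wins and she falls over; that threshold is the basic intuition the MIP maths formalizes.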
185 posts and 73 images omitted.
>>33969
>muh le 2000lb motor
>Lolwut?
It's a 2000 lb winch motor; each motor pulls 15 A, for a total of 180 watts.
>>33972 Oh, OK got it. Heh, I thought you were saying the motor weighed a ton. :^)
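For reference, assuming a 12 V winch supply (the voltage isn't stated in the post): P = V × I = 12 V × 15 A = 180 W, which matches the quoted figure for a single motor.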
Thanks for all the YouTube links, but please add some description when posting, ideally more than one line. These videos have a text description on YouTube which can be copied. With ">" you can quote it, though make sure to put it in front of each paragraph when doing so. Cheers.

Open file (428.51 KB 1500x1063 general_engineering-01.jpeg)
Open file (150.45 KB 1024x747 tools_(resized).jpg)
R&D General NoidoDev ##eCt7e4 07/21/2023 (Fri) 15:25:47 No.24152 [Reply] [Last]
This is a thread to discuss smaller or general waifu building problems, solutions, proposals and questions that don't warrant a thread or touch on more than one topic. In a way this is a technical meta, minus news. Keep it technical. A lot of topics in the old thread here >>83 have a thread on their own by now. The main topics in the old thread with the link to the related dedicated threads are listed here - it was mostly about actuation at the beginning.
Topics in the old OP:
- liquid battery and cooling in one (flow batteries) >>5080
- artificial muscles (related to actuators >>12810)
- high level and low level intelligence emulation (AI) (related to AI >>77 >>22 >>250 >>27 >>201)
- wear and maintenance, including repairs
- sanitation >>1627 (related to actuators >>12810)
> cheap hydraulic and pneumatic muscles
> woven sleeves out of strong nylon fishing line
> exhaust excess heat by breathing and panting (related to thermal management >>234)
>>1635 (related to energy systems >>5080)
> sitting in her 'recharging chair'


158 posts and 53 images omitted.
Open file (106.90 KB 827x614 8X-overunity.png)
I was looking into Joule thief circuits and came across a site where a guy claimed he somehow put out 8x as much power as the input with a few small modifications to the circuit: https://www.homemade-circuits.com/8x-overunity-circuit-using-joule-thief/ I haven't tested it yet myself, but I'd be happy if it was just a more efficient version of the regular circuit.
Open file (95.82 KB 788x790 CycloidExample.png)
A website to help you develop your own cycloid actuators. https://mevirtuoso.com/cycloidal-drive/
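As a quick companion to that generator, the basic cycloidal-drive numbers are easy to sanity-check yourself. The pin count, motor figures, and efficiency below are made-up examples:

# For a standard single-disc cycloidal drive, the disc has one lobe
# fewer than the ring has pins, and the reduction ratio (eccentric
# input shaft -> disc output) equals the lobe count.
pins = 11
lobes = pins - 1
ratio = lobes / (pins - lobes)        # 10:1 here; output turns opposite the input
print(f"{ratio:.0f}:1 reduction")

# Rough output torque/speed from a small hobby motor, with a guessed efficiency.
motor_torque_nm, motor_rpm, efficiency = 0.3, 3000, 0.85
print(f"{motor_torque_nm * ratio * efficiency:.1f} N*m at {motor_rpm / ratio:.0f} RPM")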
>>33836 Very cool, thanks Light! BTW, if I haven't greeted you before, then welcome! Cheers. :^)
>>33836 Very nice. Thanks.
>>33836
Thanks, but the big deal for me was finding the formula for OpenSCAD by Dan Fekete on Youtube, mentioned here >>24401. Maybe using something like this here and then importing the result into a program for further modification might also work, I could imagine.
