/robowaifu/ - DIY Robot Wives

Advancing robotics to a point where anime catgrill meidos in tiny miniskirts are a reality.





Bipedal Robot Locomotion General Robowaifu Technician 09/15/2019 (Sun) 05:57:42 No.237
We need to talk about bipedal locomotion. It's a complicated topic, but one that has to be solved if we are ever to have satisfyingly believable robowaifus. There has surely already been a lot of research done on this topic, and we need to start digging and find the info that's out there. There are some projects that have at least partial roboleg solutions working, but none that I know of that look very realistic yet. We likely won't come up with some master-stroke of genius and solve everyone's problems here on /robowaifu/, but we should at least take a whack at it; who knows? We certainly can't accomplish anything if we don't try.

I personally believe we should be keeping the weight out of the extremities – including the legs – while other anons think that we should add weight to the feet for balance. What are your ideas, anon? How do we control the gait? How do we adjust for different conditions? What if our robowaifu is carrying things? What about the legs during sex? Should we focus on the maths behind the MIP (Mobile Inverted Pendulum), or is there a different approach that would be more straightforward? A mixture? Maybe we can even do weird stuff like the reverse-knee legs that so many animals have. Robofaun waifu, anyone? What about having something like heelys or bigger wheels in the feet as well?

I'm pretty sure if we just put our heads together and don't stop trying, we'll eventually arrive at at least one good general solution to the problem of creating bipedal robot legs.

>tl;dr
ITT post good robowaifu legs

>tech diagrams sauce
www.youtube.com/watch?v=pgaEE27nsQw
www.goatstream.com/research/papers/SA2013/SA2013.pdf
Here's some thinking out loud about walking and the steps needed to program this, or "a" numerical/structural system to make this work. A strategy. As I walk about, I started thinking about what I'm doing. It appears to me that walking is sort of a natural, preprogrammed act where the muscles mostly know what to do. The brain tells the body "go here, go there", and it goes here, goes there. So I started thinking about how we could do the same. I came up with the idea that with small amounts of data passed to the muscles, we could get good movement. My assumptions are that we are using microcontrollers to control each muscle, with some extra processing power for each muscle.

The first step is that the brain makes a "map" of the terrain in front of it, what speed it is going at, and then where it needs to step for that speed or terrain. I watched a Jim Keller interview with Lex Fridman and Keller said that this sort of distance measurement is trivial. In other words, using two eyes you can tell where something is, like the floor, and how far away it is. He should know, he's a legend at AMD and Tesla, designing chips for...everything. So let's take his word for this (yes, there are likely to be complications, but let's ignore them for now). We will assume finding a place to step is no big deal for the waifu. A simple table of sorts could be built up of how far a step is depending on the speed it wants to go, and using that it maps where a good place to step is.

So now we have that; next, we need a vector to this place. Now, a leg, thigh, or foot movement is going to be constrained by how the joints can bend, so if the brain sends a start-moving vector (an angle for the foot to move AND a velocity) plus an end point to go to, then this set of vectors from the brain could be interpreted by ALL the muscles in the leg. Each one knows that to move here or there, it must act in a certain way. So with this one vector based on "move this way x, this way y and this way z", plus a speed, plus the end point, each muscle can work on its own to add up to this end point.

So you have (each value 2 bytes / 16 bits, giving us 65,536 different values or positions):
A starting direction for foot movement, x, y and z. If you need to step over something it will have a high value for "y": it will pick the foot up high. (6 bytes)
A velocity for x, y and z, i.e. how fast to move. (6 bytes)
An end point, x, y and z. (6 bytes)
A foot position, x and y, for how the foot needs to land on the ground. (4 bytes)
(22 bytes total)

This will create a vector that tells the foot, leg, thigh and hip how far to lift the foot, how fast, and in what direction x, y, z. This data is really just where the foot moves: what direction and how fast to move in that direction, where to move the foot to, and the position of the foot when it lands. All the muscles have a certain task to do to make this happen, and they "know" what to do; they can do their part when fed only this foot movement data.

Another thing I think would be needed to give the waifu a really strong grasp of what to do is to feed the muscles a body position based on the body mass points at the shoulders and the hips, AND a velocity vector (x, y and z) of what the hips and shoulders are doing with respect to movement. I think these two measurements could really add a lot to her position awareness with only a small amount of data being passed. In this case:
Shoulder x, y, z position (6 bytes), shoulder x, y, z velocity (6 bytes), hip x, y, z position (6 bytes), hip x, y, z velocity (6 bytes). Call it roughly 40 bytes per leg, so about 80 bytes total for feet and body in all.

Let's add in some error correction and maybe some other stuff and call it 500 bytes, just to throw a number at it; at 50 KB/s you could do this 100 times a second, i.e. every hundredth of a second. That's fairly good at a really slow network speed, though I would like to see the time interval smaller and the transfer speed kicked up a little. It's worth noting that if each processor can figure out its moves from this small set of data, it doesn't need continuous data streams; only one set per step. The walking models that claim success say they add refinements during the last part of the step, so you could send a start and then small refinements as the step continues, with way, way less data than 500 bytes every 1/100 of a second.

There are also two paths that could be followed. The main brain could figure out the proper step pattern for a specific body position and just send the 22-byte foot movement step (start direction, velocity vector and end point) to the muscles to walk, with each muscle's processor figuring out the actuation needed to perform that specific movement. Or it may be that some combination that changes depending on circumstance is used, where sometimes the muscles get body data (the shoulder and hip positions) and sometimes they just get foot movement data. I was pondering this and just decided to jot down what I was thinking.
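To make those byte counts concrete, here's a minimal sketch of how that per-step packet might be laid out as packed C structs on the muscle microcontrollers. The field names, layout, and units are my own invention for illustration; Grommet didn't specify any of this.

/* Hypothetical layout of the per-step command described above: eleven 16-bit
   values per foot (22 bytes) plus the shared body-state data (24 bytes).
   Each 16-bit field gives 65,536 possible values, as in the post. */
#include <stdint.h>

typedef struct {
    int16_t start_dir[3];  /* initial foot movement direction x, y, z (6 bytes) */
    int16_t velocity[3];   /* how fast to move along x, y, z (6 bytes)          */
    int16_t end_point[3];  /* where the foot should land, x, y, z (6 bytes)     */
    int16_t foot_pose[2];  /* foot orientation at touchdown, x and y (4 bytes)  */
} FootStepCmd;             /* 22 bytes per foot, matching the post              */

typedef struct {
    int16_t shoulder_pos[3], shoulder_vel[3];  /* 6 + 6 bytes */
    int16_t hip_pos[3],      hip_vel[3];       /* 6 + 6 bytes */
} BodyStateCmd;            /* 24 bytes of shared body context */

Two FootStepCmd packets plus one BodyStateCmd come to 68 bytes of payload per step, so the ~80-byte ballpark above (and 500 bytes once error correction and other overhead are added) is comfortable even on a slow bus.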
A good link that might later become valuable for training waifus: the WILDTRACK Seven-Camera HD Dataset (repack). This is a dataset of people walking through a square, recorded on camera. If analyzed for joint movement and then programmed into the waifu, it could show her what she needs to do to walk. Difficult, but a good resource if you know what to do with it. It's 60 GB, but it's distributed as a torrent so it's easy to download. https://academictorrents.com/details/6d5542b0d245ff4d37680f67f2fb96750e6d8c60
>>21602 This is really great & interesting stuff Grommet. I hope to have a long conversation with you about such details before too long. Cheers! :^)
Here's an old concept that's still pertinent for us today, /robowaifu/:
>ZERO-MOMENT POINT — THIRTY FIVE YEARS OF ITS LIFE
abstract:
>This paper is devoted to the permanence of the concept of Zero-Moment Point, widely known by the acronym ZMP. Thirty-five years have elapsed since its implicit presentation (actually before being named ZMP) to the scientific community and thirty-three years since it was explicitly introduced and clearly elaborated, initially in the leading journals published in English. Its first practical demonstration took place in Japan in 1984, at Waseda University, Laboratory of Ichiro Kato, in the first dynamically balanced robot WL-10RD of the robotic family WABOT. The paper gives an in-depth discussion of source results concerning ZMP, paying particular attention to some delicate issues that may lead to confusion if this method is applied in a mechanistic manner to irregular cases of artificial gait, i.e. in the case of loss of dynamic balance of a humanoid robot.
>After a short survey of the history of the origin of ZMP, a very detailed elaboration of the ZMP notion is given, with a special review concerning “boundary cases” when the ZMP is close to the edge of the support polygon and “fictitious cases” when the ZMP should be outside the support polygon. In addition, the difference between ZMP and the center of pressure is pointed out. Finally, some unresolved or insufficiently treated phenomena that may yield a significant improvement in robot performance are considered.
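For anons who want a number to compute rather than just the concept: under the common cart-table simplification (the whole robot treated as a single point mass at a constant height, which is far less general than the multi-body treatment in the paper), the ZMP collapses to a one-line formula. A rough sketch in C with made-up example values:

/* Cart-table approximation of the ZMP: one point mass (the CoM) held at a
   constant height z_c above flat ground. The same formula applies per axis. */
#include <stdio.h>

double zmp_x(double com_x, double com_accel_x, double z_c)
{
    const double g = 9.81;                   /* gravity, m/s^2             */
    return com_x - (z_c / g) * com_accel_x;  /* x_zmp = x - (z_c/g)*x_ddot */
}

int main(void)
{
    /* Example: CoM 0.8 m up, 2 cm ahead of the ankle, decelerating at 1 m/s^2 */
    printf("ZMP x = %.3f m\n", zmp_x(0.02, -1.0, 0.8));
    return 0;
}

If the computed point lands outside the stance foot's support polygon, you're in exactly the "fictitious ZMP" territory the abstract warns about: the robot is tipping and the simple balance assumption no longer holds.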
>OpenWalker
>Framework Architecture
>1.1 Problem Statement
>Biped and humanoid robots are complex systems composed of a large number of actuated degrees of freedom and various sensors that provide the requirements for the complex process of walking. Due to such complexity and number of components, in the past decades the community of humanoid robots has developed several robot architectures for different purposes and applications. All these robot architectures, ranging from small-size hobby robots to giant manned walking machines, share some anatomical similarities developed through years of work in this field. Nevertheless, the task of coordinating the information generated by all the sensors and actuators in humanoid robots poses a complex software problem which has been approached with different paradigms. Consequently, a large variety of different software frameworks has been created which, in most cases, are intrinsically connected to the robot hardware that they were designed for. This hardware dependency makes it difficult to develop software packages which can be reused by other teams. As a result, robot developers must continuously re-implement essential software modules/components to comply with the requirements and restrictions of their system, slowing down the progress of biped and humanoid robotics. In particular, two of the most essential components required by this type of robot are the balance and walking controllers.
Software will be the key to solving robowaifu bipedal locomotion. Everything is an interdependent system of course, but without good software, our robowaifu's hips, legs, & feet are simply mannequins.
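To make the hardware-dependency complaint concrete, here's a rough sketch (my own naming, not OpenWalker's actual API) of writing the walking/balance controller against an abstract joint interface, so the same controller code survives a change of robot hardware:

/* Hypothetical hardware-abstraction layer: the walking and balance logic only
   ever sees this interface, never the motor drivers or sensor buses directly. */
#include <stddef.h>

typedef struct {
    size_t num_joints;
    void (*read_positions)(double *out);          /* current joint angles, rad */
    void (*write_targets)(const double *targets); /* commanded joint angles    */
    void (*read_imu)(double rpy[3]);              /* body roll, pitch, yaw     */
} RobotHAL;

/* One control tick. 'targets' must hold hw->num_joints values. */
void walk_controller_step(const RobotHAL *hw, double *targets)
{
    double rpy[3];
    hw->read_positions(targets);   /* start from the measured posture           */
    hw->read_imu(rpy);
    /* ...gait generation and balance correction would adjust 'targets' here... */
    hw->write_targets(targets);    /* send the corrected posture to the robot   */
}

Swapping robots then only means writing a new RobotHAL, not rewriting the controllers, which is exactly the reuse problem the quote describes.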
I suggest researching prosthetics. That's the field with the most research into bipedal movement. The most annoying parts are the knees and ankles.
>>21776 Great advice Anon, thanks! The opening image ITT is a high-tech prosthetic leg.
>>21810 Yeah, I noticed. That leg is the prosthetic that started me on this research rabbit hole. That, and the prosthetic a guy made for his missing fingers: 100% mechanical, which is impressive. Some stuff, I think, can be done mechanically without needing actuators. At least, that is my theory.
>>21811
>Some stuff, I think, can be done mechanically without needing actuators. At least, that is my theory.
That's actually a very solid engineering approach to problems in general, Anon. Anything we can do/use to simplify the systems we're working with is a big plus for us in numerous ways.
>>21813 Been thinking about this a lot. I wonder if mechanical systems like windup clocks or gravity-feed systems could be used. They have a binary, on/off input-output character like digital systems, which might possibly help bring costs down.
>>21849 Just like all engineering, it's a series of choices each involving tradeoffs. We've basically 'coalesced' around batteries + BLDC actuator systems because they--in effect--bring the greatest benefits for the least costs. That being said, once the era of robowaifus begins, then I doubt not that many very clever men will find ways to further-optimize her systems, which very likely may include numerous mechanical contrivances. However it goes, it's going to be a highly-interedasting field to come! :^)
>>21811
Technically, bipedal motion can be defined by hip and trunk actuation. Humans walk as pendulums, using our muscles to pump the oscillation. Researching passive dynamic walking will help you.
>>21813
Indeed, one thing that every engineer learns through every success and failure is that simplicity is the absolute king.
>>21849
You're well on your way with that way of thinking. Human walking utilizes both gravity and potential energy stored in our elastic ligaments, which can be thought of as a windup mechanism. Using clever designs and elastic elements like springs and rubber bands is actually essential for approaching human walking efficiency. Picrel is an old walking mechanism that relies on gravity for power, with completely passive knees preventing the feet from tripping it up. There are many flaws I can see at a glance; as a fun activity, try annotating the design with how you'd improve it. Analysis for the sake of improvement can be a valuable way to quickly gain deeper understanding. An easy hint: humans move our chests side to side while walking. Why waste calories moving a large mass? How does this lower the overall caloric cost of walking?
>>21852
Watching this field has been enlightening and interesting indeed. Picrel 3 was what got me into the field many years ago. For more information on it: http://ai2001.ifdef.jp/mr1/mr1.html What brought you into the field?
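Since this anon describes walking as a pumped pendulum, here's a toy numerical sketch of the Linear Inverted Pendulum Model (LIPM), the standard pendulum abstraction used in gait work. All parameter values are made up for illustration.

/* Toy LIPM: the CoM is a point mass on a massless leg of constant height z_c,
   pivoting about the stance foot at position p. Dynamics: x_ddot = (g/z_c)*(x - p).
   Integrated with explicit Euler; the numbers below are illustrative only. */
#include <stdio.h>

int main(void)
{
    const double g = 9.81, z_c = 0.8;   /* assumed CoM height: 0.8 m          */
    const double dt = 0.001;            /* 1 ms time step                     */
    double x = -0.05, v = 0.4;          /* CoM 5 cm behind the stance foot,
                                           moving forward at 0.4 m/s          */
    const double p = 0.0;               /* stance foot at the origin          */

    for (double t = 0.0; t < 0.5; t += dt) {
        double a = (g / z_c) * (x - p); /* inverted-pendulum acceleration     */
        v += a * dt;
        x += v * dt;
    }
    printf("CoM after 0.5 s: x = %.3f m, v = %.3f m/s\n", x, v);
    return 0;
}

The takeaway for passive dynamic walking: once the CoM passes over the stance foot, gravity accelerates it into the next step for free, and the choice of the next foothold p is what keeps the oscillation pumped or lets it die out.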
Great video for learning from example https://yewtu.be/watch?v=nVC9D9fRyNU
>>21859
>Picrel 3 was what got me into the field many years ago.
Interesting.
>For more information on it:
>http://ai2001.ifdef.jp/mr1/mr1.html
Thanks!
>What brought you into the field?
I've always wanted a robot friend since boyhood, plus AI was an interesting topic as well. But as an adult, it's the concern for the welfare of the millions of men abused by the Satanic machinations of the Globohomo that drives this passion regarding robowaifus. (>>14500) And very specifically, it's for Anon that I'm working towards this robowaifu goal first and foremost. Normalfags already have metric boatloads of support systems haha. :^) But yeah, I always thought robots were cool long before I thought much about the waifu aspect of it.
>>21753 >ZERO-MOMENT POINT Yes really helpful!
>Learning Agile Soccer Skills for a Bipedal Robot with Deep Reinforcement Learning
This study investigates the use of deep reinforcement learning to train a miniature humanoid robot with 20 actuated joints to play a simplified 1v1 soccer game. The robot learned dynamic movement skills such as walking, turning, kicking, and fall recovery, as well as a basic strategic understanding of the game. The agents were trained in simulation and transferred to real robots with minor hardware modifications and basic regularization of behavior during training. The resulting policy exhibited safe and effective movements while still being dynamic and agile, outperforming a scripted baseline in terms of speed and efficiency.
Paper: https://arxiv.org/abs/2304.13653
Website: https://sites.google.com/view/op3-soccer
>>22224 Neat! Thanks RobowaifuChat-Anon, and welcome!
>>22224 Great, I hope we can do that when we need it. Mujoco: https://mujoco.org/
Open file (94.81 KB 662x883 FreaG003lc.jpg)
I'm going to repost this from the short stack thread: http://davidbuckley.net/DB/index.htm This guy made a bunch of simple bipedal walker robots that might be useful for early robowaifus.
>>22542 Very cool, thank you Anon. Kinda reminds me of some of the droids from St*rwars.
>>22224 stop bullying him!!! D:
>>22688 Excellent find Anon! Looks very promising. They've managed to capture some fairly realistic human-like motions for a prototype. I'll be interested to see more technical information from this research group in the future. Thanks Noidodev. Cheers. :^)
Open file (539.05 KB 768x432 Strandbeest.jpg)
I know there is a mechanical thread, but what if the solution to an affordable robowaifu lies in the Jansen walking mechanism? https://www.youtube.com/watch?v=C7FMIRfP1tk Dynamically changing the lengths of the links and finding the correct set of nodes could probably lead to something functional and aesthetic.
>I know there is a mechanical thread
We even have one for monster waifus: >>10259 - good luck...
>>22728 I'm going to merge this thread soon into our Bipedal Locomotion thread, OP. Jansen mechanisms (and similar ones) have been discussed here on /robowaifu/ before. Allow me to agree with Noidodev, and wish you good luck Anon! Cheers. :^)
James Bruton showing the basics of bipedal robots and the challenges in another example: https://youtu.be/WJKhpGFg4uU - not really close to a human but maybe educational.
Personally, I'd like to break this down from most doable to hardest. Bipedal motion is probably the hardest, and we should focus on AI last, just saying.
>>22786 Good idea Anon. Mind breaking down our big problems into all the 'little problems' you can think of, then posting them here as a list, ranked "Easiest to Hardest" ? TIA.
>>22786 Totally agree, but I still keep an eye on it. If I see videos about how to achieve it and learn about it, I post them. Bipedal walking itself can be broken down into smaller pieces, and some simple walking with guidance from a human would already make movement from place to place easier.
>>22787 Here's my guess, from easiest to hardest:
robot skin
robot hands
robot head
robot balance
robot mouth
robot arms
robot legs
walking
But that's just off the top of my head, based on my guesstimates just now...
>>22808 Thanks Anon! If you ever take the time to break this list down methodically, I suggest posting it to our How to solve it thread.
>>22814 I just asked Bard to arrange it.
Here is the list of the items from easiest to hardest to do:
Robot skin
Robot hands
Robot arms
Robot legs
Robot head
Robot balance
Robot mouth
Walking
Robot skin is the easiest to do because it is simply a covering for the robot's body. Robot hands are also relatively easy to do, as they can be made from a variety of materials and can be designed to perform a variety of tasks. Robot arms are also relatively easy to do, as they can be made from a variety of materials and can be designed to reach and grasp objects. Robot legs are also relatively easy to do, as they can be made from a variety of materials and can be designed to walk and run. Robot head is more difficult to do than the other items on the list, as it needs to be able to perform a variety of tasks, such as seeing, hearing, and speaking. Robot balance is also more difficult to do than the other items on the list, as it needs to be able to keep the robot upright and stable. Robot mouth is also more difficult to do than the other items on the list, as it needs to be able to perform a variety of tasks, such as eating, drinking, and speaking. Walking is the most difficult item on the list, as it requires the robot to be able to coordinate its movements in order to move forward. It is important to note that this is just a general list, and the difficulty of creating a robot will vary depending on the specific design and capabilities of the robot.
The almighty AI has spoken.
>>22819 lol I just reread that eating and drinking wtf
>>22820 Okay, I got a completely different answer from ChatGPT.
Arranging these tasks from easiest to hardest:
1. Robot skin: Adding a protective or decorative covering to the robot's body is relatively straightforward and doesn't involve complex mechanics or functionality.
2. Robot mouth: Constructing a robot mouth to mimic basic movements, such as opening and closing, can be achieved through simple mechanisms like servos or pneumatic actuators.
3. Robot hands: Designing and building robot hands that can perform basic grasping and manipulation tasks requires more complexity and precision, but it's a well-explored area in robotics.
4. Robot arms: Developing robot arms involves creating articulated structures with joints and implementing motorized actuators for movement. It requires a higher level of mechanical design and control.
5. Robot legs: Designing and implementing robot legs for walking or locomotion involves complex mechanisms for stability, balance, and coordinated movement. It requires advanced control algorithms and sensor integration.
6. Robot head: Building a robot head involves integrating various components like cameras, sensors, actuators for facial expressions, and potentially speech synthesis capabilities. It requires mechanical design, electronics, and programming skills.
7. Robot balance: Achieving stable and dynamic balance for a robot is a significant challenge. It involves complex control algorithms, sensor fusion, and precise actuator control to maintain equilibrium and adjust to external forces.
8. Walking: Developing a walking robot is one of the most challenging tasks. It requires a combination of mechanical design, kinematics, motion planning, and control algorithms. Achieving stable and efficient locomotion across different terrains demands advanced techniques like dynamic balance and gait optimization.
Please note that the difficulty of each task can vary based on the specific requirements and the level of complexity desired. This ranking provides a general idea of the relative complexity involved.
chatgpt>>>>>>>>>>>>>>>>>bard
>>22821 One last thing: we really should give that anon who made the robot head some props, considering it's so far up the list. Hope he didn't leave.
>>22808 Thanks, but we have a whole thread for making overviews, planning, and diagrams, and a subthread there does something quite similar to what you're doing here: >>9554 - breaking down the development into smaller steps. Also, please consider that the LLMs are likely not better than this board at knowing things in regards to robowaifus, since sadly we're somewhat the experts on this. On top of that, they're libtard censorship-challenged, which means they might not know that a robowaifu is supposed to be something like a woman, what a good-looking woman is supposed to look like, and so on.
here's something I've thought up. It'd be rough and rigid but it'd get the job done I think.
>>23177 Please work a bit more on your ideas before you post them, and add a description.
>>23180 Omg dude, chill. What I just posted is fine. Clearly you never played with mechanix as a kid. It's not ideal, but ideal will never see the light of day.
>>23180 Do you WANT this board to die? I get the drawing is completely childish, but your post is completely snobbish. Encourage the newcomers ffs.
>>23229 Adults oftentimes draw on napkins to express their ideas too, ehem.
Related linkage mechanism, not only for walking >>24791
> HRC Model 3 Bipedal Robot IK Test by Letic Z https://youtu.be/i6j0hg5ZIPE
>>26403 Neat! Thanks NoidoDev.
Open file (78.13 KB 922x1396 Femisapien-a21.JPG)
Attached is an inside view of the "Femisapien" toy from WowWee in the mid-2000s. The leg mechanism uses only 3 motors, with the ankle synced to the hip via a bowden cable and the hip linked to the knee via a parallelogram: one motor to move the hips side to side, and one motor per leg to move them forward/back. Demo vid: https://www.youtube.com/watch?v=rNezxoSEEB0
>>26432 Mark Tilden is a genius tbh. (>>10257) There's a concept called Neuromorphics. Its basic tenets have in some cases been solved to a large degree simply by clever designs using standard analog electronics. We could do with a bit more of that style of insight here on /robowaifu/ IMO. Cheers. :^)
Open file (3.06 MB 1844x2792 0Femisapien.png)
>>26432 Femisapien is even simpler/stupider. That's just a floating steel rod, not a bowden tube. There's nothing that moves her hips; she is underactuated, relying on her body's dynamics to move them. Those springs help reduce the energy needed to shift her upper body's mass over to one leg. This is achieved by throwing mass (her arms) to alter her upper body's center of gravity, which causes her to tip with the help of those springs. It's a great demonstration of Keep It Simple, Stupid: reducing costs to an absolute minimum by using dynamics to simulate the functions of servos that aren't there. Her arms being able to turn at the elbows and move up/down and in/out, all with one motor and potentiometer each, is also clever. The distribution of mass to even out the torque needed to move her legs across their arc, and the attention paid to her periods as a system of pendulums, are worthy of note. My all-time favorite biped. There's so much to love about her design.
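On "her periods as a system of pendulums": the number that matters for an underactuated walker is the natural swing period of the leg, since the toy walks most cheaply at roughly that cadence. A back-of-the-envelope sketch, treating the leg as a uniform rod pivoting at the hip and guessing the leg length (both assumptions; the real toy's mass distribution differs):

/* Natural period of a leg modeled as a uniform rod pivoting at the hip:
   T = 2*pi*sqrt(2L / (3g)). The 10 cm leg length is a guess for a small toy. */
#include <math.h>
#include <stdio.h>

int main(void)
{
    const double PI = 3.14159265358979;
    const double g = 9.81, L = 0.10;                  /* assumed leg length, m */
    double T = 2.0 * PI * sqrt(2.0 * L / (3.0 * g));  /* ~0.52 s full swing    */
    printf("Natural swing period: %.2f s (about %.2f s per step)\n", T, T / 2.0);
    return 0;
}

Drive the legs much faster or slower than that and you're fighting the pendulum instead of riding it, which is why cadence and mass distribution figure so prominently in her design.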
>>26502
>She is underactuated
Clever design work. After all, the simplest, most robust parts are the ones that aren't there! :^) The same goes even more so for software. As mentioned here (>>26447), if we can use simple analog electronic parts to fill in for what would otherwise need complex software, then the attributes above can be expanded to include "faster" as well. Thanks for the nice breakdown analysis, Kiwi. I wish the two of you had physical copies of Femisapien so you could really do proper teardowns/reverses. Cheers. :^)
Human Moves, Robotic Grooves: How AI Mimics Human Motion in Stunning Detail https://youtu.be/3LkWydJq6y0
