/robowaifu/ - DIY Robot Wives

Advancing robotics to a point where anime catgrill meidos in tiny miniskirts are a reality.

Bipedal Robot Locomotion General Robowaifu Technician 09/15/2019 (Sun) 05:57:42 No.237
We need to talk about bipedal locomotion. It's a complicated topic, but one that has to be solved if we are ever to have satisfyingly believable robowaifus. There has surely already been a lot of research done on this topic, and we need to start digging and find the info that's out there. There are some projects that have at least partial roboleg solutions working, but none that I know of that look very realistic yet. We likely won't come up with some master-stroke of genius and solve everyone's problems here on /robowaifu/, but we should at least take a whack at it; who knows? We certainly can't accomplish anything if we don't try.

I personally believe we should be keeping the weight out of the extremities – including the legs – while other anons think that we should add weight to the feet for balance. What are your ideas, anon? How do we control the gait? How do we adjust for different conditions? What if our robowaifu is carrying things? What about the legs during sex? Should we focus on the maths behind MIP (Mobile Inverted Pendulum), or is there a different approach that would be more straightforward? A mixture? Maybe we can even do weird stuff like the reverse-knee legs so many animals have. Robofaun waifu anyone? What about having something like Heelys or bigger wheels in the feet as well?

I'm pretty sure if we just put our heads together and don't stop trying, we'll eventually arrive at at least one good general solution to the problem of creating bipedal robot legs.

>tl;dr
ITT post good robowaifu legs

>tech diagrams sauce
www.youtube.com/watch?v=pgaEE27nsQw
www.goatstream.com/research/papers/SA2013/SA2013.pdf
Open file (14.92 MB 800x600 Healthy_2D.mp4)
Open file (90.53 KB 352x768 nms-simulator-2.png)
Neuromechanical simulation of human locomotion (SimGait + HBP)

https://biorob.epfl.ch/research/research-dynamical/simgait/
Open file (415.45 KB 3503x2478 snapshot_V2.png)
>>1601
Multi-Modal Locomotion
By the same lab. Includes executable simulation software links (Win, Mac).

https://biorob.epfl.ch/research/research-humanoid/research-humanoid-walkman/
>>1602
Related papers:

Walking controller
>Imprecise dynamic walking with time-projection control

Human-like animations
>Scalable closed-form trajectories for periodic and non-periodic human-like walking

sauce:
github.com/salmanfaraji/Walking3LP
Neural Control of Balance During Walking
>Neural control of standing balance has been extensively studied. However, most falls occur during walking rather than standing, and findings from standing balance research do not necessarily carry over to walking. This is primarily due to the constraints of the gait cycle: Body configuration changes dramatically over the gait cycle, necessitating different responses as this configuration changes. Notably, certain responses can only be initiated at specific points in the gait cycle, leading to onset times ranging from 350 to 600 ms, much longer than what is observed during standing (50–200 ms). Here, we investigated the neural control of upright balance during walking. Specifically, how the brain transforms sensory information related to upright balance into corrective motor responses. We used visual disturbances of 20 healthy young subjects walking in a virtual reality cave to induce the perception of a fall to the side and analyzed the muscular responses, changes in ground reaction forces and body kinematics. Our results showed changes in swing leg foot placement and stance leg ankle roll that accelerate the body in the direction opposite of the visually induced fall stimulus, consistent with previous results. Surprisingly, ankle musculature activity changed rapidly in response to the stimulus, suggesting the presence of a direct reflexive pathway from the visual system to the spinal cord, similar to the vestibulospinal pathway. We also observed systematic modulation of the ankle push-off, indicating the discovery of a previously unobserved balance mechanism. Such modulation has implications not only for balance but plays a role in modulation of step width and length as well as cadence. These results indicated a temporally-coordinated series of balance responses over the gait cycle that insures flexible control of upright balance during walking.

<Keywords: balance, walking, neural feedback, vision, virtual reality, sensorimotor control

www.ncbi.nlm.nih.gov/pmc/articles/PMC6146212/
Open file (126.60 KB 560x477 zad0021028820002.jpg)
Dynamic Principles of Gait and Their Clinical Implications
>A healthy gait pattern depends on an array of biomechanical features, orchestrated by the central nervous system for economy and stability. Injuries and other pathologies can alter these features and result in substantial gait deficits, often with detrimental consequences for energy expenditure and balance. An understanding of the role of biomechanics in the generation of healthy gait, therefore, can provide insight into these deficits. This article examines the basic principles of gait from the standpoint of dynamic walking, an approach that combines an inverted pendulum model of the stance leg with a pendulum model of the swing leg and its impact with the ground. The heel-strike at the end of each step has dynamic effects that can contribute to a periodic gait and its passive stability. Biomechanics, therefore, can account for much of the gait pattern, with additional motor inputs that are important for improving economy and stability. The dynamic walking approach can predict the consequences of disruptions to normal biomechanics, and the associated observations can help explain some aspects of impaired gait. This article reviews the basic principles of dynamic walking and the associated experimental evidence for healthy gait and then considers how the principles may be applied to clinical gait pathologies.

www.ncbi.nlm.nih.gov/pmc/articles/PMC2816028/
Open file (580.75 KB paper.pdf)
Modeling robot geometries like molecules, application to fast multi-contact posture planning for humanoids
>Traditional joint-space models used to describe equations of motion for humanoid robots offer nice properties linked directly to the way these robots are built. However, from a computational point of view and convergence properties, these models are not the fastest when used in planning optimizations. In this paper, inspired by Cartesian coordinates used to model molecular structures, we propose a new modeling technique for humanoid robots. We represent robot segments by vectors and derive equations of motion for the full body. Using this methodology in a complex task of multi-contact posture planning with minimal joint torques, we set up optimization problems and analyze the performance. We demonstrate that compared to joint-space models that get trapped in local minima, the proposed vector-based model offers much faster computational speed and a suboptimal but unique final solution. The underlying principle lies in reducing the nonlinearity and exploiting the sparsity in the problem structure. Apart from the specific case study of posture optimization, these principles can make the proposed technique a promising candidate for many other optimization-based complex tasks in robotics.

<Terms: Humanoid Robots, Kinematics, Biologically-Inspired Robots, Task Planning, Gesture, Posture and Facial Expressions, Optimization and Optimal Control

https://infoscience.epfl.ch/record/230040/
Edited last time by Chobitsu on 11/28/2019 (Thu) 16:54:52.
>>237
I think the most fundamental thing for walking is balance. It could be framed as an optimization problem: find the minimum amount of energy needed to maintain a pose or state. It's really important we acknowledge robowaifus will have different bodies and parts than us, and what counts as a balanced gait for them will be something else entirely, unless we pay close attention to engineering their bodies with weight, range of motion and forces similar to ours.

The second most important thing is experience. Robowaifus will need to practice to improve their world model's prediction function, to judge the weight of objects, their weight distribution, the slipperiness of surfaces and such, so they understand what the minimum energy state will be. There are things robots don't have sensors for, like feeling a cylinder sliding through their grip, but there are visual cues and other signals that can be detected, such as a slipping object suddenly feeling lighter. Robots can also have completely different sensors in their body parts, such as accelerometers and gyroscopes.

Objects in general need to be approached with caution and gentleness, using only the minimum amount of force needed to achieve tasks. Overexertion requires corrections and can be identified within the system as shaking. In disaster situations, rubble and debris may also move when walked upon. Motion planning will need to navigate uncertain environments and avoid potential motor stalls and falls by taking the safest understood route, or stopping altogether if every option is too dangerous. Collisions with other people, stairs and moving vehicles must also be factored in, but I think finding the minimum energy required will naturally solve this, because walking into a bus going 50 kph is gonna be a lot of energy to overcome.

I wish I had some experience with 3D programming. If someone could put together a simple mechanical simulation or find one, I could rig up some AI to first master using its limbs and then optimize it to walk towards a goal with minimum energy. It would help us study how limb weight, weight distribution, joint placement, actuator power and range of motion will affect gait and we can come up with better designs that can balance more like a human being or something else entirely that's beautiful and practical in its own way.
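A minimal sketch of the kind of experiment described above, in plain Python/NumPy rather than a full physics engine (the link lengths, target, effort weight and the random-search optimizer are all illustrative assumptions): a 2-link planar leg "practices" placing its foot on a target by trial and error, while wasted motion is penalized.

import numpy as np

L1, L2 = 0.40, 0.38                 # thigh and shin lengths in meters (assumed)
TARGET = np.array([0.30, -0.60])    # desired foot position in the hip frame
W_EFFORT = 0.05                     # weight of the energy/effort penalty (assumed)

def foot_position(q):
    # Forward kinematics of a hip-knee chain; angles in radians from straight down.
    hip, knee = q
    knee_xy = np.array([L1 * np.sin(hip), -L1 * np.cos(hip)])
    return knee_xy + np.array([L2 * np.sin(hip + knee), -L2 * np.cos(hip + knee)])

def cost(q, q_prev):
    # Task error plus a crude "effort" term (squared joint displacement).
    task = np.sum((foot_position(q) - TARGET) ** 2)
    effort = np.sum((q - q_prev) ** 2)
    return task + W_EFFORT * effort

def optimize(steps=2000, sigma=0.05, seed=0):
    # Simple random hill-climbing: keep a perturbation only if the cost drops.
    rng = np.random.default_rng(seed)
    q = np.zeros(2)                 # start with a straight leg
    best = cost(q, q)
    for _ in range(steps):
        candidate = q + rng.normal(0.0, sigma, size=2)
        if cost(candidate, q) < best:
            q = candidate
            best = cost(q, q)       # re-baseline so effort is charged per accepted move
    return q, best

q, c = optimize()
print("joint angles (rad):", np.round(q, 3))
print("foot position:", np.round(foot_position(q), 3))

A real study of limb weight, joint placement and actuator power would swap the toy kinematics for a proper simulator, but the structure (objective = reach the goal + energy penalty) stays the same.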
>>1608
>If someone could put together a simple mechanical simulation
<or find one,
would this one do?
>>1602
Open file (33.52 KB 246x251 simulation.png)
>>1610
Not sure, I'm on Linux and would have to see if it can run in Octave. I do most of my AI in PyTorch and it seems Octave has a Python interface.

I think this will be helpful but I was thinking more of a simulation where weights can be attached to body parts for more realistic torque calculations. That way we can test, design and evolve parts to be more functional in the real world.
>>1611
Well, I'm setting up Octave myself for the very first time right at this moment to try and test the program, so you're ahead of me heh.

Yes, that diagram helps. I'm thinking about building a 3D model for robowaifus, starting with a graph description of the bones for rigging using boost.graph. Once that's workable to the degree where mesh data can be bound to the bones and rendered, do you think you could work with it?
Here's the error I'm getting trying to run the m script w/ Octave:
johnny@mactoshub:~/_msc/rw/repos/Walking3LP$ octave Walking3LP.m
error: 'gui_mainfcn' undefined near line 62 column 5
error: called from
Walking3LP at line 62 column 5
>>1611
BTW, if you examine the GUI image, you see that mass can be added/removed from limbs for the sim runs. Looks very flexible to me tbh.
>>1614
Here's the code section in question. The error line mentioned is the one inside the else clause. Apparently 'gui_manfcn()' is a Matlab API which Octave must have a different name for. Any matlab guys here?
% Begin initialization code - DO NOT EDIT
gui_Singleton = 1;
gui_State = struct('gui_Name', mfilename, ...
                   'gui_Singleton', gui_Singleton, ...
                   'gui_OpeningFcn', @Walking3LP_OpeningFcn, ...
                   'gui_OutputFcn', @Walking3LP_OutputFcn, ...
                   'gui_LayoutFcn', [] , ...
                   'gui_Callback', []);
if nargin && ischar(varargin{1})
    gui_State.gui_Callback = str2func(varargin{1});
end

if nargout
    [varargout{1:nargout}] = gui_mainfcn(gui_State, varargin{:});
else
    gui_mainfcn(gui_State, varargin{:});
end
% End initialization code - DO NOT EDIT
>>1615
>gui_mainfcn() *
Open file (603.31 KB 866x523 quattroped.png)
There's a simple but inelegant short-term solution: add an extra limb with omnidirectional wheels to the bipedal robot. It can be an armature that's detachable at the hip, with the wheels behind the body. Or it can retract by folding inside the lower or upper legs, depending on the kinematics.

This would allow for decent mobility at a slightly-faster-than-shuffling pace, since 3 points of contact on the ground at all times gives static rather than dynamic stability. It also lets you add a lot more weight to the robot without worrying about balance.

This is the only robot I've seen that tries a hybrid folding limb approach and is pretty innovative. As for a simple mechanical simulation program I'm also trying to find something that does this. playdynamo.itch.io/dynamo looked promising but it performs terribly in Wine and keeps trying to connect to the internet when blocked. I might give FreeCAD or Blender another try but both of those are very time consuming to get used to.
>>1617
Clever idea about the wrap-around wheels heh. We've had numerous discussions on the alternatives to bipedalism for our robowaifus here over the years. I think, for those of us not willing to give up on a human-like form for our waifus like myself :^), the general consensus is that the nearest workable alternative is to give her a motorized wheelchair to move around with until we can perfect bipedal locomotion.

Still, good ideas anon keep them coming.
>>1617
BTW, I think the Pepper robot uses a tri-wheel scheme for its base?
>>1618
I see my approach more as an assisted form of bipedalism building up to the real thing. This quick sketch might give a better idea of what I'm talking about. It would still have the appearance of walking so it's not comparable to sitting in a wheelchair or being carted around vertically on a hand truck.

By using two mecanum wheels you get to spin around by running each wheel in opposite directions. Omnidirectional wheels have their own advantages and disadvantages.
>>237
70,000 USD for a fucking prosthetic leg, jfc.
>>1621
I see. Yes that diagram does make it clear. In effect, you are providing her with an irl cartoonishly oversized 'foot' with which to get around on, but one that would shrink back to normal size when not needed.

>>1622
Ever price even the most basic of medical supplies in a hospital context anon? No surprises tbh.
>>1624
>Ever price even the most basic of medical supplies in a hospital context anon? No surprises tbh.

Yep, but 70k is still a lot for a device that's not even worth a fraction of the price in materials or manufacturing. I mean what's to be expected when every megalomaniac cocksucker and their granny is allowed to milk people's health for all it's worth and call it a day.
>>1625
Look I'm not promoting their (((approach))) I'm simply encouraging you to be realistic. Let's continue this off-topic discussion in the Lounge if you'd like.
Open file (831.34 KB 755x902 softfoot.png)
Open file (14.98 KB 173x509 EMIEW2.jpg)
Open file (237.35 KB 373x624 WL-16.png)
>>1621
Looking into humanoid bipedal robots that have a single wheel on each foot, this EMIEW2 is the closest one. Other robots typically have several wheels per foot (Zephyr), have the wheel as part of a normal robot foot to add wheeled locomotion (GoRoBot), or have more than a pair of legs. The other decent design is the WL-16 leg platform using WS-2 feet, but that has a caster alongside a powered wheel.

I've gone through at least 50 research papers on robot walking methods (and have another 50 to go through) and have seen some interesting designs and solutions, but none of them use the approach I'm proposing, where the wheel is only used to shift the center of gravity during the stride while keeping the robot stable with 3 points of contact on the ground at all times.

One thing that hit me when browsing another forum is that those mini-Segways cost less than $200 now. Rather than trying to do bipedal locomotion, I'm going to have my first model balance properly on top of one of those until a cheap solution is found.
>>1776
>I've gone through at least 50 research papers on robot walking methods(and have another 50 to go through)
Mind posting a few of them here for the rest of us anon?
>>1776
>hoverboards
Yep, we've had this conversation before now. The posts haven't been restored here yet but I came down in favor of the idea myself.
>>1776
Why is that robot in the middle wearing a diaper?
>>1777
Eh, most of them aren't worth reading, otherwise I'd have uploaded a pack of them here. These two are probably the best I've found on the topic for a general overview.

>>1778
The Segway mini isn't really a hoverboard, since you use your knees rather than your feet to control it. The center of gravity is much lower to the ground, and there's no need to have the robot keep itself upright on it.

>>1794
It looks even more like a diaper from the back. The reason for the protrusion around the abdomen is it's designed to sit in the seiza position with giant feet.
>>1806
>Eh most of them aren't worth reading otherwise I'd have uploaded a pack of them here. These two are probably the best I've found on the topic for a general overview.
Fair enough, thanks for filtering out the mediocre for us Anon.
>segway mini
https://www.invidio.us/watch?v=h8GUkc6mzzI
They seem a little more expensive than 'less than $200'. Am I missing something? They do look like a good choice. The robowaifu would still need basic bipedal locomotion to be able to use it effectively (climbing on/off, steering), but nothing particularly sophisticated. Good idea.
Open file (18.97 KB 248x500 Roll.jpg)
A lot of good thinking going on here anons, but much of this discussion is fundamentally flawed. We want our waifu to be:
-Inexpensive
-Easily Reproducible
-Low Power
From these tenets, having any active stabilization beyond the minimum isn't advisable. She should have legs that naturally conform to walking on various terrains. Passive balance methods (a tail, a dress that hides balancing struts, big feet with a low center of gravity, or having her hold something like a walker in front of her) will all work for a robowaifu that is inexpensive and low power, since she doesn't waste energy on balance. Roll is a great example of a low center of mass with big feet.
>>1873 I don't think balance is a concern for complexity or power consumption. Returning to balance after outside disturbances is easily accomplished; getting a dynamic gait is the big problem. Maybe we could cheat it with a neural network. Really, if you're not just making a sex robot, you're going to be using neural networks or something close anyway.
A simple solution for movement is to go completely on rails. Not rails literally, but railings, countertops and other waist-height edges. The waifu would be restricted to the home, but would have plenty of well-defined static supports to use and only flat surfaces to traverse. She'd just never cross a wide open room, or maybe you could throw a pool table in the middle of it. Some Anons have entertained the idea of using a tether for power and could equip their home with a literal overhead rail system that would be far more discreet than the other stabilization ideas mentioned.
>>1882 If you're using rails and tethers, why walk? Also, self-balancing is easy for humans but hard for computers. Neural networks should be used for her personality and communication; it's better to make her walking something that relies on good mechanics rather than complex software. We must keep costs and complexity, both in hardware and software, to a minimum.
I'd like to throw another idea in. For very human-like waifus, wheels are the way to go, but if she should have normal-looking feet, then let's build her some kind of rolling boots. Like those for skiing, but less tight and with wheels on the bottom. Batteries included. She would only need to keep her balance, and knowing how the contraption behaves is going to help with that. Also, her ankle would be held by the boot, and she wouldn't need to move her arms or legs. That aside, if they have good sensors in their feet we should be able to solve bipedal walking at some point anyway. Though it isn't the highest priority, of course.
>>4331 That's a pretty good idea Anon. Some sort of fore/aft-extendable Heelys might be sufficient to allow her to both skate around and to remain fairly stable. >That aside, if they have good sensors in their feet we should be able to solve bipedal walking at some point anyways. TBH, it will take far more than just good sensors in the feet to solve this but I believe we will in time.
>>1885 >We must keep costs and complexity both in hardware and software, to a minimum. Well said, Anon. Even for non-profit, garage-lab, labors-of-love projects, complexity & costs will always be an issue. And for the entrepreneurs among us, they already know keeping them low is a critical factor to success.
>>4369 I look at it this way: those who work on more expensive waifus might need simpler prototypes, which would use tech from lower-cost bots. So this doesn't harm the enthusiasts with less money. Also, the richer guys might pick up some other approaches as a side project. The scope here is from virtual, paper waifus to quite human-like ones with a lot of sensors.
>>4375
>The scope here is from virtual, paper waifus to quite human-like with a lot of sensors.
Alright, fair enough Anon. My primary ambition with all this is to help literally millions of disenfranchised men build their own robowaifus at home. That means structural drinking straws, 3D-printed parts, cheap paper and plastic shells, cheap electrical motors, batteries, and electronics. The software I'm giving away entirely free at my own expense. For the guys who want to build the US$120K deluxe robowaifus ready to go onto a SciFi set for a long day of shooting the latest series, I say more power to them. As you say, there is a wide scope of interests here at /robowaifu/ and that's a good thing ofc.
Has anyone printed Poppy and tried it with cheaper motors? Anyone ever looked for other people who tried that? Some ideas in the comments: https://hackaday.com/2014/03/25/open-source-humanoid-robot-is-awesom-o/ Files, software, tutorials: https://github.com/poppy-project/poppy-humanoid Their Youtube: https://www.youtube.com/c/PoppyprojectOrgVideos
>>4436 Yeah, you really hit the nail on the head with that question Anon. Poppy is a great little project, but those Dynamixel servos are waaaay too expensive for the torques you get. Granted they are good quality, but unless you're wealthy they are simply too expensive in my opinion. I'm trying to puzzle out a solid approach to having a 'central power unit' in the torso, that uses simple (and cheap) regular DC motors to generate the force, and then distribute that around the body via clever transmission/transduction schemes.
>>4438 Cool. Did you print out Poppy and you're working on that? I don't know much about this stuff, yet. It seems to be important to measure where everything is, then adapt the body or the next move. Did you consider using a neuronal network for that?
>>4438 Cool. Did you print out Poppy, and are you working on that? I don't know much about this stuff yet. It seems to be important to measure where everything is, then adapt the body or the next move. Did you consider using a neural network for that?
Open file (316.07 KB 1500x949 poppy-humanoid-github.jpg)
>>4441 Oh, so you're saying it only makes sense to print it out and try it, when one knows about the size of the replacement dc motors? When I mentioned NN here I didn't mean some AI to create a whole waifu mind, but specifically to putting all the states of the motors and joints in, then learning from falling or not falling. Maybe looking not only into the last state, but the ones before which lead to failing or success.
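As a hedged sketch of that "learn falling vs. not falling from motor and joint states" idea (the joint count, window length and random stand-in data are assumptions; a real version would use logged states from a simulator or the robot itself), a small PyTorch classifier over a short history window might look like this:

import torch
from torch import nn

N_JOINTS, WINDOW = 12, 10              # joints and history length (assumed)
FEATURES = N_JOINTS * 2 * WINDOW       # position + velocity per joint, per step

model = nn.Sequential(
    nn.Linear(FEATURES, 128), nn.ReLU(),
    nn.Linear(128, 64), nn.ReLU(),
    nn.Linear(64, 1),                  # logit: will this trajectory end in a fall?
)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.BCEWithLogitsLoss()

# Stand-in dataset: each sample is a flattened window of joint pos/vel,
# labeled 1.0 if the episode ended in a fall, 0.0 otherwise.
states = torch.randn(256, FEATURES)
fell = torch.randint(0, 2, (256, 1)).float()

for epoch in range(20):
    opt.zero_grad()
    loss = loss_fn(model(states), fell)
    loss.backward()
    opt.step()

# At runtime, a high predicted fall probability could trigger a recovery reflex.
p_fall = torch.sigmoid(model(states[:1]))
print("predicted fall probability:", float(p_fall))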
Does anyone know a website with a search function to compare as many electric motors as possible? I didn't find anything and don't have the nerves for it today. I didn't mean AliExpress; I need to filter them by weight and other values. It's not only for finding motors for >>4436 (Poppy), but yeah, that would be a good example.
>>4473 Based on what this Skyentific guy on Youtube is using for his arms, I ended up finding out that drone or RC motors will probably be what's needed to get the same power, with the sensors and controls then implemented in another way. I found them by searching "brushless multirotor motor". That's all I need for now. When I have my printer I might play around with it, first only thinking about sensors.
>>1602
Link broken, new one: https://www.epfl.ch/labs/biorob/research/humanoid/walkman/
Open file (27.43 KB 450x450 7x20mm.jpeg)
>>4504 One thing you have to be able to accommodate is extremely high RPMs with these motors, especially the tiny little ones for whoop-type micro drones (~50'000 RPM!). They drain energy rather quickly as you might imagine. But, they can impart a remarkable amount of lbs-ft in short order. Think of them kinda like F1 engines for fairybots heh. Example: Apex RC Products 7x20mm Fast 17,900kv / 66,000rpm CW CCW Brushed Motor Set
Open file (37.18 KB 566x415 67020.jpg)
>>4505 Yes, some gearbox for reduction will be necessary. But you really picked an extreme one, and I rather meant the flat motors. In the video https://youtu.be/WKRLlthr9kY he is using something like the one in the pic. I have no data for it, since it seems to be out of production; similar-looking ones have an RPM of 1200-5000, which would need to be reduced by a factor of about 150 or 200, I guess. Maybe that's not the right approach anyway; the second picture is what the Dynamixel looks like (with ~200:1 reduction included). I'll look into it on occasion; don't know yet if it will be possible to get a gearbox separately. It's far from urgent anyway.
>>4510
>But you really picked an extreme one
Hehe. That's because I bought a few of these for use in robowaifu arms and to have an 'excuse' to shell out some readies for a whoop drone :^). These little babies are so lightweight that you could embed 5 of them into the end of each forearm just below the wrist w/o paying too high a lever-moment kinematic penalty. And, given their ability to wind up basically instantly, their ability to impart a remarkably high amount of (accumulated) force in short order makes them a unique possibility for driving strong, lightweight robowaifu hands. Inexpensive too! As with all engineering, everything is a tradeoff. These scalding little bitches put out the heat of a thousand suns. In their drone use, the motors' shells are both open to the air and located directly under their equally high-speed rotating blades, which cools them off. We'd have no such luxury with them inside of our robowaifu's wrists, and I still haven't managed to puzzle out a reasonable cooling system for a 5-set yet. If we can somehow manage that together, then it would be an ideal use-case for these great little motors.
>>4513 I rather wonder how much torque they have, and how you would add a gearbox or other reduction. However, this might fit better into >>406 for actuators in general. We still don't have a thread for hands and forearms, or arms in general. I wonder if there was one on the old forum and whether it will be imported?
Reminder: this was possible 6 years ago: https://youtu.be/8uiPHL-xey4 and this is from recently: https://youtu.be/aVJs_x2a9z4 I don't think people should focus on walking, but if, for example, one doesn't need a human-like doll form that can walk without making a lot of noise, then developing something might be much easier. Same goes for something rolling with feet, like little tanks: https://www.thingiverse.com/thing:972768 (okay, that's a bit OT in a thread about bipedal walking). Gymnastics and dancing: https://youtu.be/BhYUjSAwNB4
>>5223 >okay, that's a bit OT in a thread about bipedal walking ehh, it's not too bad. we do have a wheelchair waifus thread >>2983, so it might be there, but this is probably the best place imo.
This here >>5358 is about walking; James Bruton released his example of a walking bot in 2018.
Walking? Meh! Balance bot wheels for feet. "Rollergirl never takes off her skates".
>>5634 you must into pix plox
Open file (162.01 KB 1594x1093 hovershoes.jpg)
Open file (48.07 KB 900x614 new_cubli.jpg)
>>5635 Some things I have been looking at:
#1 Hovershoes. An off-the-shelf solution for locomotion with many open DIY solutions available. Balance control becomes an issue for weight above the hip line. Might suggest thinking about internal gyroscopic stabilization; give it its own gravity well in its chest.
#2 The Cubli robot uses motorized disks that spin up to manipulate balance. Legs can act as dynamic shocks instead of handling primary locomotion and balance. Air muscles are viable for this job, since the legs need to loosely hold positions instead of supplying quick motion.
What used to be a complex circuit for understanding balance and motor control can be done with an open-source RC flight controller with tweaked PIDs. $.02
>>5645 Thanks. Yes, I think hovershoes or some other form of motorized system could serve well as motive 'footwear' in the interim. And gyro-stabilization is also generally a good idea, if power-hungry. Ultimately, typical human bipedal locomotion is the goal, but in the meantime we'll settle for alternative approaches as needs be.
>PIDs
So, for the uninitiated, PID is some sort of feedback control mechanism then?
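For reference, yes: PID is a feedback loop whose output is a weighted sum of the present error (P), its accumulated history (I) and its rate of change (D). A minimal generic sketch follows; the gains and update rate are placeholders, not values from any particular flight controller.

class PID:
    """Generic PID controller: output = Kp*e + Ki*integral(e) + Kd*de/dt."""
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, setpoint, measured):
        error = setpoint - measured
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# Example: drive a body tilt angle back to 0 degrees at a 100 Hz loop rate.
controller = PID(kp=12.0, ki=0.5, kd=0.8, dt=0.01)   # placeholder gains
motor_command = controller.update(setpoint=0.0, measured=3.5)
print(motor_command)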
>>5667 Apologies, that page seems to be account-walled. Here's a playlist https://invidious.snopyta.org/playlist?list=PLn8PRpmsu08pQBgjxYFXSsODEF3Jqmm-y
>>5667 Turns out this guy Brian Douglas has his own independent YT channel. https://www.youtube.com/channel/UCq0imsn84ShAe9PBOFnoIrg
>>5671 Apologies, don't mean to bang on about it, but I'm finding more good resources as I dig along. https://engineeringmedia.com/resources
here we have an interesting mechanism https://www.youtube.com/watch?v=uv-Qp8p8Jqg
>>5930 That is interesting thanks. Is there a paper or anything more available on it Anon?
>>5931 Nah, this was a random video in my youtube feed. I am wondering if this mechanism would be enough to achieve balance.
>>5933 Here's his channel, seems like he's done a few of these. Amateur designer? https://www.youtube.com/channel/UC7hw6ztc_mnFh630R0MS5uQ
>>5936 https://www.youtube.com/watch?v=vlBgMg9WFfA My god. Luckily my family is heavy into machinery, so maybe I will be able to build this. Only the mechanism, not the propulsion.
>>5982 I think if you used 4 of those MIT Mini Cheetah clone thin disc actuators talked about here >>4890 in the hips, then you could probably pull this off. Apparently they run ~$350-400 ea. so plan on about $1500 for the actuators. But if you can get this working it would be both relatively inexpensive and remarkably efficient afaict. That anon's exact design would need some modification. The support frame for the hips would have to separate out the two pairs far enough apart to work well for the onahole cartridge inside robowaifus. https://biomimetics.mit.edu/
>related xpost >>8704
Feet development, related to DARA project >>8704
Are we not making this harder than it needs to be? There are reasons humans have a high center of mass and it takes more power and effort (and years of practice) to walk: our need to be 5 or 6' tall fluid-filled bags, for one. Let's take advantage of the clean slate we have with robowaifus. Several stacking advantages could make mobility a non-issue:
1. Unfixed center of mass in design: as in "megaman" robots (and the Roll example), the center of mass is drawn down into the boots, which could be weighted further. (For "intimate" times the boots could be taken off to reveal more human-like feet, when sitting comfortably or lying down, etc.) The rest of the structure doesn't need much; we don't need steel I-beams for a skeleton, and the battery and heavier parts could be set lower on the frame for the purpose of lowering the center of mass.
2. Gyroscopes: as in the Cubli example above, gyros provide stability via angular momentum, which could even be used to "push" against to maintain balance. Not sure how power-consumptive gyros are, but I imagine once they are set spinning it doesn't take a whole lot more to maintain the momentum with decent bearings and lubrication.
3. AI-assisted learning: same as how we learned to walk. Trial and error until a way to tackle moving from any point A to B across any slope or terrain is simply reflex. (I'd like to talk more about assigning movements to "reflex" CPUs. This is key, since obviously we don't go thinking about every specific movement we need to make; we just think "go down stairs" or "open the door" and habit and reflex handle the rest, correct?)
Probably more to add to this list, but I see no reason these three advantages would not stack in a way that solves the problem of mobility simply and elegantly.
>>9053 The idea of a Roll-like waifu has been bounced around here before, and certainly keeping track of the center of gravity is vital for bipedal locomotion. As to your third point, particularly:
>(I'd like to talk more about assigning movements to "reflex" CPUs. This is key, since obviously we don't go thinking about every specific movement we need to make; we just think "go down stairs" or "open the door" and habit and reflex handle the rest, correct?)
This seems to be pretty closely related to the idea of 'Neuromorphic Computing'. (You can find a paper in the Library thread.) I'm inclined to agree with your ideas on this, and it's certainly good biomimetics, nervous-system-wise. It entails its own complexities too ofc, but I imagine the tradeoffs will be well worth it. Good ideas, thanks! It would be nice if you'd sketch some ideas out for us so we can see clearly what you mean about 'stacking advantages'?
>>9053 Some of this is food for thought. However, we're going to see different approaches, because people have different priorities. For example, I wouldn't want her legs to feel too light when I lift them, and I'd rather she not need to wear boots all the time (for outside it would be okay, though). Also, I won't really understand the problem until I've tried it out myself. Humans do it, so it should be solvable. Another thing: bipedal walking without help or guidance (walls etc.) should be towards the end of any priority list, imho. For a domestic girlfriend it isn't very important. She could walk with help, using walls; be driven, or drive herself with some device; walk on all fours; dance while gripping something (e.g. a pole); stand up after grabbing something, then pose or do the dishes while keeping her balance; ...
>>9054
>Good ideas, thanks! It would be nice if you'd sketch some ideas out for us so we can see clearly what you mean about 'stacking advantages'?
Stacking as in multiplying, not merely adding, each advantage. Gyroscope + low center of mass is an order more advantageous than gyroscope alone or low center of mass alone, because each is a different "vector" and not merely adding to the same vector. Does that clarify things?
>>9055 Noted, but also consider that a powerful enough internal gyro, or set of gyros, could effectively act as an "internal" balance pole of sorts.
>>9057 >Does that clarify things? Yes, it does thanks.
>>9672 Thanks kindly for the links, Anon.
>>9058 Agreed. IMO, any practical robowaifu walking system today -- given our current rudimentary state of kinematics sensing, planning, and animation -- will necessarily have some kind of gyroscopic stabilization to succeed. Here's an example of what good gyro systems can do >>5645 > the Cubli robot https://robohub.org/swiss-robots-cubli-a-cube-that-can-jump-up-balance-and-walk-across-your-desk/
Finally someone did a study on minimizing energy consumption in gaits.
>We focus on the problem of developing efficient controllers for quadrupedal robots. Animals can actively switch gaits at different speeds to lower their energy consumption.
>In this paper, we devise a hierarchical learning framework, in which distinctive locomotion gaits and natural gait transitions emerge automatically with a simple reward of energy minimization.
>We use reinforcement learning to train a high-level gait policy that specifies the contact schedules of each foot, while the low-level Model Predictive Controller (MPC) optimizes the motor torques so that the robot can walk at a desired velocity using that gait pattern.
>We test our learning framework on a quadruped robot and demonstrate automatic gait transitions, from walking to trotting and to fly-trotting, as the robot increases its speed up to 2.5m/s (5 body lengths/s). We show that the learned hierarchical controller consumes much less energy across a wide range of locomotion speed than baseline controllers.
https://arxiv.org/pdf/2104.04644.pdf
This could be adapted to bipedal walking by creating different expert policies for not only different speeds but also carrying items, moving hands and arms around, wearing a backpack, different size weights on the chest, cat girl tails and so on.
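A hedged sketch of the general "fold energy into the reward" idea, not the paper's exact formulation: reward velocity tracking and penalize mechanical power, here approximated as the sum of |torque x joint velocity|. The weights are placeholder assumptions.

import numpy as np

def gait_reward(forward_vel, target_vel, torques, joint_vels,
                w_vel=1.0, w_energy=0.002):
    # Track the desired speed...
    vel_term = -abs(forward_vel - target_vel)
    # ...while charging for an estimate of mechanical power spent this step.
    power = np.sum(np.abs(np.asarray(torques) * np.asarray(joint_vels)))
    return w_vel * vel_term - w_energy * power      # higher is better

# Example step: walking slightly too slowly with moderate joint effort.
r = gait_reward(forward_vel=0.9, target_vel=1.0,
                torques=[2.0, -1.5, 0.8, -0.6],
                joint_vels=[1.2, -0.9, 2.0, -1.8])
print(r)

The same scalar could pick up extra terms for the carried-load, backpack and tail cases mentioned above.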
Adversarial Motion Priors for Stylized Physics-Based Character Control
>Synthesizing graceful and life-like behaviors for physically simulated characters has been a fundamental challenge in computer animation. Data-driven methods that leverage motion tracking are a prominent class of techniques for producing high fidelity motions for a wide range of behaviors. However, the effectiveness of these tracking-based methods often hinges on carefully designed objective functions, and when applied to large and diverse motion datasets, these methods require significant additional machinery to select the appropriate motion for the character to track in a given scenario.
>In this work, we propose to obviate the need to manually design imitation objectives and mechanisms for motion selection by utilizing a fully automated approach based on adversarial imitation learning. High-level task objectives that the character should perform can be specified by relatively simple reward functions, while the low-level style of the character's behaviors can be specified by a dataset of unstructured motion clips, without any explicit clip selection or sequencing. These motion clips are used to train an adversarial motion prior, which specifies style-rewards for training the character through reinforcement learning (RL). The adversarial RL procedure automatically selects which motion to perform, dynamically interpolating and generalizing from the dataset.
>Our system produces high-quality motions that are comparable to those achieved by state-of-the-art tracking-based techniques, while also being able to easily accommodate large datasets of unstructured motion clips. Composition of disparate skills emerges automatically from the motion prior, without requiring a high-level motion planner or other task-specific annotations of the motion clips. We demonstrate the effectiveness of our framework on a diverse cast of complex simulated characters and a challenging suite of motor control tasks.
https://xbpeng.github.io/projects/AMP/
Basically a way to use adversarial learning to imitate reference motions while seeking to solve a goal.
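In the same spirit, here is a hedged sketch of combining a task reward with a discriminator-derived style reward, using a generic adversarial-imitation (GAIL-style) formulation. The AMP paper's actual discriminator objective and reward shaping differ in their details, and the observation size, network shape and weights here are assumptions.

import torch
from torch import nn

STATE_DIM = 32                                   # assumed observation size
disc = nn.Sequential(nn.Linear(STATE_DIM * 2, 256), nn.ReLU(),
                     nn.Linear(256, 1))          # realism logit for a transition

def style_reward(s, s_next):
    # Higher when the (state, next_state) transition looks like the mocap data.
    with torch.no_grad():
        d = torch.sigmoid(disc(torch.cat([s, s_next], dim=-1)))
    return -torch.log(torch.clamp(1.0 - d, min=1e-6))

def total_reward(task_r, s, s_next, w_task=0.5, w_style=0.5):
    # Blend the simple goal reward with the learned style reward.
    return w_task * task_r + w_style * style_reward(s, s_next).item()

# Example: score one transition sampled from the policy during an episode.
s, s_next = torch.randn(STATE_DIM), torch.randn(STATE_DIM)
print(total_reward(task_r=1.0, s=s, s_next=s_next))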
>>10160
>Finally someone did a study on minimizing energy consumption in gaits.
Excellent. I consider that topic to be essential for finally achieving the idealized goal of a human-like robowaifu without having a state-sponsored R&D budget available. This will only just be on the very edge of what's even technically feasible at the moment. If it's achieved at all, it will be due to everything on board the robowaifu being designed to be as efficient and economic as possible. Macro physical behaviors like walking will consume significant energy resources. So, not only will efficient gaiting be needed to even work in a natural, lifelike way, but it should also make it possible to have our robowaifus walk around for more than 2-3 minutes without immediately needing to do a full 8-hour deep recharge before she can move for another 2-3 minutes, etc., etc.
Ever wonder why a humanoid robot as well-funded as even Honda's ASIMO is never shown walking for more than just a few steps in any promotional? Because its gait is just about as inefficient as it could be, power-consumption wise. And even significant steps forward in this arena, such as Boston Dynamics' parkour bot, aren't shown going on long walks; in their case it's more a question of weight not being kept to a minimum, even though the gaiting algorithms are much better. Every.thing. on board must be optimized for robowaifus to succeed and become the reality we all dream of. Everything.
>>10164 I think in real use robowaifus will be more like laptops, plugged in most of the time but able to disconnect and go at any time. My robowaifu would most likely be sitting 95% of the time and my place is so small she could get around while still being connected to power. It'll be an interesting problem navigating small spaces without knocking anything over and not burning excessive amounts of power to avoid stuff.
>>10181
>It'll be an interesting problem navigating small spaces without knocking anything over
Yes, that is definitely going to require us to have an effective solution for at least two needs, Anon:
1. Body-awareness. Distinct from Theory of Mind, this needs to encompass not only a detailed knowledge of her own physical volume & shape, but also a predictive methodology that can plan for her own upcoming kinematic/physics dynamics, based on her own short-term planned movements.
2. Excellent situational-awareness. She needs to have a detailed map of her environment at all times -- and one that can more or less instantly adapt to changes in it. Your own motions in the space, for example, or your cat's.
On a related point, until and unless we can actually give her a Theory of Mind, she'll have to rely on situational awareness (and possibly some types of crude heuristics) to be able to figure out where you are or shortly will be. Thus (just one of the many) need(s) for a near-instantaneous observation capability.
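A minimal sketch of the mapping half of point 2: a 2D log-odds occupancy grid updated from range measurements. The cell size, grid extent and sensor confidence values are assumptions, and a real robowaifu would want a 3D and far more dynamic version.

import numpy as np

CELL = 0.05                      # 5 cm cells (assumed)
GRID = np.zeros((200, 200))      # 10 m x 10 m of log-odds; 0 = unknown
L_OCC, L_FREE = 0.85, -0.4       # log-odds increments for a hit / a pass-through

def world_to_cell(x, y):
    # Grid origin at the center of the map.
    return int(x / CELL) + 100, int(y / CELL) + 100

def update_ray(x0, y0, x1, y1):
    # Mark cells along a sensor ray as free, and the endpoint cell as occupied.
    steps = int(max(abs(x1 - x0), abs(y1 - y0)) / CELL) + 1
    for t in np.linspace(0.0, 1.0, steps):
        i, j = world_to_cell(x0 + t * (x1 - x0), y0 + t * (y1 - y0))
        GRID[i, j] += L_FREE
    i, j = world_to_cell(x1, y1)
    GRID[i, j] += L_OCC - L_FREE     # endpoint: undo the free mark, add occupied

def occupancy_probability(i, j):
    return 1.0 / (1.0 + np.exp(-GRID[i, j]))

# Example: the robowaifu at the origin senses an obstacle 1.2 m straight ahead.
update_ray(0.0, 0.0, 1.2, 0.0)
print(occupancy_probability(*world_to_cell(1.2, 0.0)))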
>>10183 It would make sense to do this in a physics simulation. Those weren't only meant for training; she could also run one all the time for awareness. Of course not simulating everything all the time, only when necessary, and with simplified objects.
>>10186 Yes, that might be a good approach Anon, and for both of the enumerated needs too.
DrGuero made a new short video https://youtu.be/wxH3vQOz3JA, where he mentions that his code is open source now. His website http://ai2001.ifdef.jp is a bit slow right now, maybe too much load after releasing the video. Also, it's in Japanese, so some translation software will be necessary.
>>10343 Thanks Anon! Here's teh code page, I'm DLing now. http://ai2001.ifdef.jp/uvc/code.html
>update
OK, I've had a quick look at the code. It's old-style 'C with classes' (c.1985 C++) code. It seems to have only one major external dependency, Open Dynamics Engine. https://ode.org/ It's also W*ndows software. I'll try to sort out getting it to build on a modern C++ compiler on Linux sometime shortly, probably this coming week. Thanks Anon, he may be able to help our Bipedal Locomotion progress along.
>>10343 >>10344 update: OK, I've assembled the 4 code files into a workable project. I haven't set about trying to build it yet b/c dependencies. https://files.catbox.moe/wn95n1.7z However, if anyone would care to work on translating the embedded comments in the codefiles, then that would be a big help to me later on getting the code to run for us. TIA.
>>10344
>https://ode.org/
update: By all appearances, the original ODE project has been abandoned. However, it's apparently continued under a new team. The project is now located at:
https://bitbucket.org/odedevs/ode/src/master/

pamac info ode
Name : ode
Version : 0.16.1-1
Description : High performance library for simulating rigid body dynamics
URL : https://bitbucket.org/odedevs/ode/
Licenses : BSD LGPL
Repository : community
Installed Size : 1.7 MB
Packager : Antonio Rojas <arojas@archlinux.org>
Build Date : 19/03/20
Install Date : 10/05/21
Install Reason : Explicitly installed
Signatures : Yes

I've gotten all the demos to build on Linux, so that's a good sign. I'll have to sort out all the dependency issues in the build file, but I'd think this project should be doable on Linux in the end. As is commonplace for me, I overestimated my abilities and it will take longer than I planned. I'll post here when I have it all working. Probably a week or two, hopefully.
>>10409 Thanks, I would have helped already, but I have problems with my browser. Also there are a lot of comments, so I'll need to use sed or something like it. Great if you want to work on it, but bipedal walking isn't something very pressing. So don't worry about how long it takes.
>>10410 Thanks Anon, appreciated. I think r/n it's more a matter of getting something working early, so that the AI geniuses here will have a readily-extensible tool they can connect their AIs to and use to coordinate a robowaifu's body movements, etc., via an early simulator environment. My personal goal for the software is to keep refining it into an actual robowaifu runtime, and for it to work onboard with small SBCs inside her. In my early tests thus far with ODE, it's looking pretty encouraging (~ sub-millisecond collision responses, etc.)
>related xpost (>>10601, pdf embed)
>>5409 Here's the whole history of James Bruton's bipedal walking robots, from 2004 to around 2018. Robot X, shown in >>5409, was the end result. https://youtu.be/JWvH5PHKK74
>>10815 The real challenge isn't locomotion itself, it's keeping the robot from falling over while moving. Though Boston Dynamics seems to have solved this issue. https://www.youtube.com/watch?v=fn3KWM1kuAw
>>11911 Everyone knows that they've solved it. That doesn't mean we know exactly how, or that we could replicate the same technique. You're missing the point: Bruton showed what we can do as smaller developers or hobbyists. He didn't even study robotics. Also, he didn't overthink it, like some here tend to do; he just went by trial and error.
>>11911
>The real challenge isn't locomotion itself, it's keeping the robot from falling over while moving.
There's the rub, isn't it?
>Though Boston Dynamics seems to have solved this issue.
Now if only they'd just release their code open-source, we'd all be good to go! :^) In the meantime, a Mobile Inverted Pendulum approach from UCSD has managed some verifiably-functional basic results (and we have direct access to the C code itself). Much more work needs to be done to mold it into a human-like bipedal locomotion system suitable for a household robowaifu (>>7824).
https://www.ucsdrobotics.org/mips
https://github.com/beagleboard/librobotcontrol/blob/master/examples/src/rc_balance.c
https://en.wikipedia.org/wiki/Inverted_pendulum
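To make the MIP idea concrete, here is a hedged sketch (not the librobotcontrol code): linearize a point-mass pendulum on a wheeled base about upright, compute an LQR gain, and simulate recovery from a small tilt. The masses, CoM height and Q/R weights are illustrative assumptions.

import numpy as np
from scipy.linalg import solve_continuous_are

M, m, l, g = 2.0, 8.0, 0.5, 9.81     # base mass, body mass, CoM height (assumed)

# State: [position, velocity, tilt angle, tilt rate]; input: horizontal force.
A = np.array([[0, 1, 0, 0],
              [0, 0, -m * g / M, 0],
              [0, 0, 0, 1],
              [0, 0, (M + m) * g / (M * l), 0]])
B = np.array([[0], [1 / M], [0], [-1 / (M * l)]])

Q = np.diag([1.0, 1.0, 50.0, 5.0])   # penalize tilt most heavily (assumed weights)
R = np.array([[0.1]])
P = solve_continuous_are(A, B, Q, R)
K = np.linalg.inv(R) @ B.T @ P       # state-feedback gain: u = -K x

x = np.array([0.0, 0.0, 0.15, 0.0])  # start tilted about 8.6 degrees
dt = 0.005
for step in range(600):              # 3 seconds of simulated balancing
    u = -K @ x
    x = x + dt * (A @ x + B @ u)     # simple Euler integration of the linear model
    if step % 200 == 0:
        print(f"t={step*dt:.1f}s tilt={np.degrees(x[2]):+.2f} deg")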
OK, so anon's OSRM project has gotten me back to thinking about light+strong robowaifu skeletons again. On the topic of bipedal locomotion, a thought occurred to me that I want to put out here.
Since extra-dense, high-mass items like batteries are proportionally a higher percentage of the total mass for a strawgirl or OSRM robowaifu, wouldn't it stand to reason that this mass could be articulated into a useful counterweight system within the lower torso/pelvis volume, swinging back and forth as a counter-balance for the overall system during natural-gait walking/running/etc.? The fact that the mass is a higher percentage means it should be a particularly effective approach for either of these two types of robowaifus, but I imagine it would be helpful even in a heavier robowaifu too.
The basic idea is that you fix the batteries into an encasement attached to a short, inverted pendulum (inverted = batteries higher than the pivot point). As the robowaifu's hips swing in one direction during her gait, the counterbalance pivots in the opposite direction. Animals already do this kind of thing instinctively during locomotion, and humans do as well; our neuro-musculo-skeletal complex gets wired up for complex kinematic motions such as these as we grow. By the time we're 5 or 6 yo, we normally have the 'force/counter-force' coordination down pat. Some of you may be aware that many tall buildings in modern cities have massive 'floating' counterweights (tuned mass dampers) near the top that move this way and that depending on wind forces (and even earthquakes). A counterforce mass in our robowaifu's belly can serve a similar function and keep her center of gravity centered even when she's walking around. That is all.
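A tiny worked example of the counterweight idea, under the simplifying assumption that we only want the combined lateral CoM to stay put; the masses and offsets are made up for illustration.

def counterweight_offset(m_body, x_body, m_cw):
    # To keep the combined center of mass fixed over the support point,
    # the counterweight must satisfy m_cw * x_cw = -m_body * x_body.
    return -(m_body * x_body) / m_cw

# Example: a 12 kg robowaifu (excluding batteries) sways her CoM 3 cm to one
# side during a step; a 3 kg battery pack must swing 12 cm the other way.
print(counterweight_offset(m_body=12.0, x_body=0.03, m_cw=3.0))   # -> -0.12 m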
Open file (47.51 KB 480x640 mr1.jpg)
Open file (78.13 KB 922x1396 images89.jpg)
>>12562
>Another Anon that actually understands physics and the importance of mass.
Based and mass-shift pilled. Check out Dr. Guero's work in this field (picrel): http://ai2001.ifdef.jp/mr1/mr1.html
Femisapien also used this property. You can see her mass-shifting point: it's the rotary encoder on her crotch. She shifts the mass of her body and electronics onto one leg, then moves the other leg forward as her weighted leg moves back. All you need is a parallel mechanism to keep her feet level and an added steering servo to twist her thighs for turning. Please do post any work you make, even failures; I have a thread filled with my failures.
>>12582 Will do anon, thanks for link.
How soon will we have a bipedal robot that can imitate humans?
>>13439 When the geniuses stop focusing on just the legs and finally figure out you can't have bipedal motion without fucking ears (i.e., a sense of balance), and start using gyroscopes.
>>13447 How would you implement them?
>>13451 The same way the human body does: a feedback loop making continuous micro-adjustments. Autopilots already do this with stabilizers, but that's easy for something whose plane is perpendicular to gravity; parallel planes are in a league of their own, so don't bother until synthetic muscle fibers become a thing.
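The "ears" in practice are an IMU. A hedged sketch of the classic complementary filter that fuses a drifting-but-fast gyro with a noisy-but-absolute accelerometer into one tilt estimate for that micro-adjustment loop; the blend factor and sample rate are assumptions.

import math

ALPHA = 0.98      # trust the gyro 98%, the accelerometer 2% (assumed)
DT = 0.01         # 100 Hz IMU loop (assumed)

def complementary_filter(angle, gyro_rate, accel_x, accel_z):
    # gyro_rate: pitch angular velocity in rad/s.
    # accel_x, accel_z: accelerometer readings in m/s^2; at rest their ratio
    # encodes the tilt relative to gravity.
    accel_angle = math.atan2(accel_x, accel_z)
    return ALPHA * (angle + gyro_rate * DT) + (1.0 - ALPHA) * accel_angle

# Example loop iteration: slight forward lean, small gyro rate.
angle = 0.0
angle = complementary_filter(angle, gyro_rate=0.02, accel_x=0.5, accel_z=9.8)
print(angle)      # the tilt estimate to feed the balance controller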
> (>>16593 - information & videos -related)
>Energy optimization during walking involves implicit processing [1]
>Gait adaptations, in response to novel environments, devices or changes to the body, can be driven by the continuous optimization of energy expenditure. However, whether energy optimization involves implicit processing (occurring automatically and with minimal cognitive attention), explicit processing (occurring consciously with an attention-demanding strategy) or both in combination remains unclear. Here, we used a dual-task paradigm to probe the contributions of implicit and explicit processes in energy optimization during walking. To create our primary energy optimization task, we used lower-limb exoskeletons to shift people's energetically optimal step frequency to frequencies lower than normally preferred. Our secondary task, designed to draw explicit attention from the optimization task, was an auditory tone discrimination task. We found that adding this secondary task did not prevent energy optimization during walking; participants in our dual-task experiment adapted their step frequency toward the optima by an amount and at a rate similar to participants in our previous single-task experiment. We also found that performance on the tone discrimination task did not worsen when participants were adapting toward energy optima; accuracy scores and reaction times remained unchanged when the exoskeleton altered the energy optimal gaits. Survey responses suggest that dual-task participants were largely unaware of the changes they made to their gait during adaptation, whereas single-task participants were more aware of their gait changes yet did not leverage this explicit awareness to improve gait adaptation. Collectively, our results suggest that energy optimization involves implicit processing, allowing attentional resources to be directed toward other cognitive and motor objectives during walking.
>Humans Are Designed to Think While Walking [2]
>One of my favorite things to do when I am on vacation is hike in the mountains and take in as much scenery and contact with wildlife as possible. The former requires that I stay in good enough physical conditioning that I can achieve 15+ miles of mountain hiking per day. Therefore, when I am not on vacation, I go for a two-to-four-mile run every day before breakfast. That morning routine keeps me in physical shape and prepares me to undertake the research and writing projects for that day.
Our problems are much, much simpler than God's were when He was designing us human beings with all our faculties, including these two. However, I'd say it's a good model for us to follow. After all, robowaifus should be able to talk with us about different things; like being young newlyweds while she's cooking a meal for us upstairs at the pub, right Anon? :^)
Maybe Carver Mead's (et al) Neuromorphics can help us all out with this a bit. To wit: push the computation out to the edges of a [robowaifu's] system[s]. That way, while the 'autonomous' things are happening, her central-core computation mesh can be freed up to talk with us about important things.
1. https://pubmed.ncbi.nlm.nih.gov/34521117/
2. https://reasons.org/explore/blogs/todays-new-reason-to-believe/humans-are-designed-to-think-while-walking
>===
-minor fmt edit
-add 'important things' cmnt
Edited last time by Chobitsu on 07/25/2022 (Mon) 22:07:22.
I think this general approach should be applicable; it has been validated on a real-life Swiss quadruped robot [1] and on simulated [2] & real bipeds [3]. Compared to less-validated approaches it's a clear winner. You don't have to implement it in the main AI; it's better if it runs on a small, low-latency auxiliary NN. It doesn't require too much compute or data, and the gait can be tuned by adding energy expenditure & smoothness terms to the loss. You can also include mocap data and tune the model on it for a humanlike gait.
1. https://leggedrobotics.github.io/rl-blindloco/ https://www.youtube.com/watch?v=8sO7VS3q8d0
2. https://www.youtube.com/watch?v=hx_bgoTF7bs
3. https://techxplore.com/news/2021-04-robot.html
>crosslink-related (>>17989)
>(conversation-related >>18421, ...)
> (crosslink related : >>20777) Primitive 'walking' smol android.
>>17064
>Learning Quadrupedal Locomotion over Challenging Terrain
These are great links. It got me thinking about something I would love to have besides a robot waifu: an ostrich runner I could ride like a horse. I would set it up with a roll cage on top and a body harness inside the cage, because you know the thing will trip every so often, and with a harness that wouldn't be a problem. Another addition could be an extra front leg: "if" it starts to trip, and it will eventually, it sticks out the front leg and rolls the robot to the side. That way you don't face-plant but sort of roll into the fall.
So I looked up how much energy an ostrich uses to run. I can't imagine it's as much as a horse (746 watts, i.e. one horsepower), since horses are heavier. I found a link and a set of papers on ostriches. This lady did a thesis on them running: https://www.scienceinschool.org/article/2011/ostrich/ and two papers, one on the ostrich and one on the energy needed for men, horses and ostriches to walk.
Only got one paper attached; here's the other. And damn, my typing is so bad, and in this little box I can't see my mistakes. Sigh.
Here's some thinking out loud about walking and the steps needed to program this, or "a", numerical structural system to make it work. A strategy.
As I walk about, I started thinking about what I'm doing. It appears to me that walking is sort of a natural, preprogrammed act where the muscles mostly know what to do. The brain tells the body go here, go there; it goes here, goes there. So I started thinking about how we could do the same, and I came up with the idea that with small amounts of data passed to the muscles we could get good movement. My assumption is that we are using microcontrollers to control each muscle, with some extra processing power for each muscle.
The first step is that the brain makes a "map" of the terrain in front of it, notes what speed it is going at, and then works out where it needs to step for that speed or terrain. I watched a Jim Keller interview with Lex Fridman, and Keller said that this sort of distance measurement is trivial. In other words, using two eyes you can tell where something like the floor is and how far away it is. He should know; he's a legend at AMD and Tesla, designing chips for... everything. So let's take his word for this (yes, there are likely to be complications, but let's ignore them for now). We will assume finding a place to step is no big deal for the waifu. A simple table of sorts could be built up of how far a step is depending on the speed it wants to go, and using that it maps where a good place to step is.
So now we have that; next we need a vector to this place. Now, leg, thigh and foot movement is going to be constrained by how they can bend, so if the brain sends a start-moving vector (an angle for the foot to move AND a velocity) and an end point to go to, then this set of vectors from the brain could be interpreted by ALL the muscles in the leg. Each one knows that to move here or there, it must act in a certain way. So with this one vector, based on "move this way x, this way y and this way z", plus a speed, plus the end point, each muscle can work on its own to add up to this end point. So you have (each value 2 bytes (16 bits), giving us 65,536 different values or positions):
A starting direction for foot movement, x, y and z. If you need to step over something it will have a high value for "y": it will pick the foot up high. (6 bytes)
A velocity for x, y and z, i.e. how fast to move. (2 bytes x, 2 bytes y and 2 bytes z: 6 bytes)
An end point. (2 bytes x, 2 bytes y and 2 bytes z: 6 bytes)
And a foot position, x and y, for how the foot needs to land on the ground. (4 bytes)
(22 bytes total)
This will create a vector that tells the foot, leg, thigh and hip how far to lift the foot, how fast, and in what direction (x, y, z). This data is really just where the foot moves: what direction and how fast to move in that direction, where the foot is to move to, and the position of the foot when it lands. All the muscles have to do a certain task to make this happen, and they "know" what to do; they do their part from being fed only foot-movement data.
Another thing I think would be needed, to give the waifu a really strong grasp of what to do, is to feed the muscles a body position based on the body mass points at the shoulders and the hips, AND a velocity vector (x, y and z) of what the hips and shoulders are doing with respect to movement. I think these two measurements could really add a lot to her position awareness with only a small amount of data being passed. In this case:
Shoulder x,y,z position (6 bytes), shoulder x,y,z velocity (6 bytes), hip x,y,z position (6 bytes), hip x,y,z velocity (6 bytes). (Two feet, 40 bytes x 2, so 80 bytes total for feet and body in all.)
At a 50 KB/s link, 80 bytes is nothing. Add in some error correction and maybe some other stuff, call it 500 bytes just to throw a number at it, and you could still send this 100 times a second, i.e. every hundredth of a second. That's fairly good at a really slow network speed, though I'd like to see the interval smaller and the transfer speed kicked up a little. It's worth noting that if each processor can figure out its moves from this small set of data, it doesn't need a continuous data stream - only one set per step. The walking models that claim to be successful say they add refinements in the last part of the step, so you could send a start and then small refinements as the step continues, with way, way less data than 500 bytes every 1/100 of a second.
There are also two paths that could be followed. The main brain could figure out the proper step pattern for a specific body position and just send the 22-byte foot movement command (start vector, velocity and end point) to the muscles, with each muscle's processor figuring out the actuation needed to perform that specific movement. Or some combination could be used depending on circumstance, where sometimes the muscles get body data (the shoulder and hip positions) and sometimes they just get foot movement data. I was pondering this and just decided to jot down what I was thinking.
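To make the byte counts above concrete, here's a minimal sketch of how such a per-step packet could be packed for the muscle controllers. It's only an illustration of the layout described above, not any real protocol; the field names, the int16 scaling (e.g. millimeters and mm/s), and the little-endian choice are assumptions of mine.

import struct

# Hypothetical per-foot step command, following the ~22-byte layout described above:
# start direction (x,y,z), velocity (x,y,z), end point (x,y,z), plus foot landing
# orientation (x,y). 11 signed 16-bit fields = 22 bytes.
STEP_FMT = "<11h"   # little-endian, 11 x int16 (an assumed encoding)

# Hypothetical body-state supplement: shoulder pos/vel and hip pos/vel, x,y,z each.
# 12 signed 16-bit fields = 24 bytes.
BODY_FMT = "<12h"

def pack_step(start_dir, velocity, end_point, foot_pose):
    """Pack one foot's step command into 22 bytes (values already scaled to int16)."""
    return struct.pack(STEP_FMT, *start_dir, *velocity, *end_point, *foot_pose)

def pack_body(shoulder_pos, shoulder_vel, hip_pos, hip_vel):
    """Pack the shoulder/hip position and velocity supplement into 24 bytes."""
    return struct.pack(BODY_FMT, *shoulder_pos, *shoulder_vel, *hip_pos, *hip_vel)

# Example: lift the foot 80 mm, swing it 400 mm forward, land flat.
step = pack_step((0, 80, 0), (200, 100, 0), (400, 0, 0), (0, 0))
body = pack_body((0, 0, 1500), (100, 0, 0), (0, 0, 900), (100, 0, 0))
print(len(step), len(body))   # 22 24 -- matches the per-field byte counts above

Every joint controller on the leg would receive the same few bytes once per step and work out its own actuation from them, which is the whole point: no continuous data stream, just one small command (plus optional mid-step refinements) per step.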
A good link that might later become valuable for training waifus: The WILDTRACK Seven-Camera HD Dataset (repack). This is a dataset of people walking through a public square, recorded on camera. If analyzed for joint movement and then fed to the waifu, it could show her what she needs to do to walk. Difficult, but a good resource if you know what to do with it. It's 60 GB, but it's a torrent so it's easy to download. https://academictorrents.com/details/6d5542b0d245ff4d37680f67f2fb96750e6d8c60
>>21602 This is really great & interesting stuff Grommet. I hope to have a long conversation with you about such details before too long. Cheers! :^)
Here's an old concept that's still pertinent for us today, /robowaifu/ :
>ZERO-MOMENT POINT — THIRTY FIVE YEARS OF ITS LIFE
abstract:
>This paper is devoted to the permanence of the concept of Zero-Moment Point, widely-known by the acronym ZMP. Thirty-five years have elapsed since its implicit presentation (actually before being named ZMP) to the scientific community and thirty-three years since it was explicitly introduced and clearly elaborated, initially in the leading journals published in English. Its first practical demonstration took place in Japan in 1984, at Waseda University, Laboratory of Ichiro Kato, in the first dynamically balanced robot WL-10RD of the robotic family WABOT. The paper gives an in-depth discussion of source results concerning ZMP, paying particular attention to some delicate issues that may lead to confusion if this method is applied in a mechanistic manner onto irregular cases of artificial gait, i.e. in the case of loss of dynamic balance of a humanoid robot.
>After a short survey of the history of the origin of ZMP a very detailed elaboration of ZMP notion is given, with a special review concerning "boundary cases" when the ZMP is close to the edge of the support polygon and "fictitious cases" when the ZMP should be outside the support polygon. In addition, the difference between ZMP and the center of pressure is pointed out. Finally, some unresolved or insufficiently treated phenomena that may yield a significant improvement in robot performance are considered.
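For anyone who wants a concrete handle on ZMP before digging into the paper: under the common cart-table / linear inverted pendulum simplification (constant CoM height, flat ground), the ZMP can be read straight off the center-of-mass motion, and "dynamic balance" is just the check that it stays inside the support polygon. This is a minimal sketch of that textbook approximation only, not the full multi-body formulation the paper covers; the numbers and the rectangular-foot check are made up for illustration.

# Cart-table / LIPM approximation: ZMP from center-of-mass state.
#   x_zmp = x_com - (z_com / g) * x_com_accel   (same form for the y axis)
G = 9.81  # m/s^2

def zmp_xy(com_pos, com_acc, com_height):
    """Approximate ZMP (x, y) on flat ground, assuming constant CoM height."""
    x, y = com_pos
    ax, ay = com_acc
    return (x - (com_height / G) * ax,
            y - (com_height / G) * ay)

def inside_support(zmp, x_min, x_max, y_min, y_max):
    """Crude dynamic-balance check: is the ZMP inside a rectangular support polygon?"""
    x, y = zmp
    return x_min <= x <= x_max and y_min <= y <= y_max

# Example: CoM 0.8 m high, decelerating forward at 1 m/s^2, standing on a ~22 cm foot.
z = zmp_xy(com_pos=(0.05, 0.0), com_acc=(-1.0, 0.0), com_height=0.8)
print(z, inside_support(z, -0.05, 0.17, -0.05, 0.05))   # ZMP ~0.13 m forward, still inside

The "boundary" and "fictitious" cases the abstract mentions are exactly what happens when that check starts to fail: the computed point runs up against, or would fall outside, the edge of the foot.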
>OpenWalker
>Framework Architecture
>1.1 Problem Statement
>Biped and humanoid robots are complex systems composed by a large number of actuated degrees of freedom and various sensors that provide the requirements for the complex process of walking. Due to such complexity and number of components, in the past decades, the community of humanoid robots has developed several robot architectures for different purposes and applications. All these robot architectures, ranging from small-size hobby robots to giant manned walking machines, share some anatomical similarities developed through years of work in this field. Nevertheless, the task of coordinating the information generated by all the sensors and actuators in humanoid robots poses a complex software problem which has been approached with different paradigms. Consequently, a large variety of different software frameworks has been created which, in most of the cases, are intrinsically connected to the robot hardware that they were designed for. This hardware dependency makes it difficult to develop software packages which can be reused by other teams. As a result, the robot developers must continuously re-implement essential software modules/components to comply with the requirements and restrictions of their system, slowing down the progress of biped and humanoid robotics. In particular, two of the most essential components required by this type of robots are the balance and walking controllers.
Software will be the key to solving robowaifu bipedal locomotion. Everything is an interdependent-system of course, but without good software, our robowaifu's hips, legs, & feet are simply mannequins.
I suggest researching prosthetics. That's the field with the most active research into bipedal movement. The most annoying parts are the knees and ankles.
>>21776 Great advice Anon, thanks! The OP opener image ITT is a high-tech prosthetic leg.
>>21810 Yeah, I noticed. That leg is the prosthetic that started me down this research rabbit hole. That, and the prosthetic a guy made for his missing fingers - 100% mechanical, which is impressive. Some stuff I think can be done mechanically that doesn't need actuators. At least that is my theory.
>>21811
>Some stuff I think can be done mechanically that doesn't need actuators. At least that is my theory.
That's actually a very solid engineering approach to problems in general, Anon. Anything we can do/use to simplify the systems we're working with is a big plus for us in numerous ways.
>>21813 I've been thinking about this a lot. I wonder if mechanical systems like windup clocks or gravity-feed systems could be used. They have a binary input/output character, like digital systems, which might help bring costs down.
>>21849 Just like all engineering, it's a series of choices each involving tradeoffs. We've basically 'coalesced' around batteries + BLDC actuator systems because they--in effect--bring the greatest benefits for the least costs. That being said, once the era of robowaifus begins, then I doubt not that many very clever men will find ways to further-optimize her systems, which very likely may include numerous mechanical contrivances. However it goes, it's going to be a highly-interedasting field to come! :^)
>>21811 Technically, bipedal motion can be defined by hip and trunk actuation. Humans walk as pendulums, using our muscles to pump the oscillation. Researching passive dynamic walking will help you.
>>21813 Indeed, one thing every engineer learns through every success and failure is that simplicity is the absolute king.
>>21849 You're well on your way with that way of thinking. Human walking uses both gravity and potential energy stored in our elastic ligaments, which can be thought of as a windup mechanism. Using clever designs and elastic elements like springs and rubber bands is actually essential for approaching human walking efficiency. Picrel is an old walking mechanism that relies on gravity for power, with completely passive knees preventing the feet from tripping it over. There are many flaws I can see at a glance; for a fun activity, try annotating the design with how you'd improve it. Analysis for the sake of improvement can be a valuable way to quickly gain deeper understanding. An easy hint: humans move our chests side to side while walking. Why waste calories moving a large mass? How does this lower the overall caloric cost of walking?
>>21852 Watching this field has been enlightening and interesting indeed. Picrel 3 was what got me into the field many years ago. For more information on it: http://ai2001.ifdef.jp/mr1/mr1.html
What brought you into the field?
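To put a rough number on the pendulum picture: treating the swinging leg as a simple pendulum gives a feel for the 'free' swing time a passive or lightly-powered walker settles into. This is only a back-of-the-envelope sketch under assumptions I'm picking myself (simple pendulum, 0.9 m hip-to-ground length); a real leg is a compound pendulum and the muscles pump the swing, which is why actual human cadence runs faster than this unpowered figure.

import math

G = 9.81  # m/s^2

def swing_period(leg_length_m):
    """Simple-pendulum period for a leg of the given length (compound-pendulum effects ignored)."""
    return 2.0 * math.pi * math.sqrt(leg_length_m / G)

# One step takes roughly half a swing period (each leg swings forward once per stride).
leg_length = 0.9  # m, assumed hip-to-ground length
T = swing_period(leg_length)
steps_per_min = 60.0 / (T / 2.0)
print(f"swing period ~{T:.2f} s, unpowered cadence ~{steps_per_min:.0f} steps/min")

The design takeaway is the same point made above: gravity and elastic elements set a natural rhythm for free, and the actuators only need to top up that oscillation rather than fight it.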
Great video for learning from example https://yewtu.be/watch?v=nVC9D9fRyNU
>>21859
>Picrel 3 was what got me into the field many years ago.
Interesting.
>For more information on it:
>http://ai2001.ifdef.jp/mr1/mr1.html
Thanks!
>What brought you into the field?
I've always wanted a robot friend since boyhood, plus AI was an interesting topic as well. But as an adult, it's concern for the welfare of the millions of men abused by the Satanic machinations of the Globohomo that drives this passion regarding robowaifus. (>>14500) And very specifically, it's for Anon that I'm working towards this robowaifu goal first and foremost. Normalfags already have metric boatloads of support systems haha. :^) But yeah, I always thought robots were cool long before I thought much about the waifu aspect of it.
>>21753 >ZERO-MOMENT POINT Yes really helpful!
>Learning Agile Soccer Skills for a Bipedal Robot with Deep Reinforcement Learning
This study investigates the use of deep reinforcement learning to train a miniature humanoid robot with 20 actuated joints to play a simplified 1v1 soccer game. The robot learned dynamic movement skills such as walking, turning, kicking, and fall recovery, as well as basic strategic understanding of the game. The agents were trained in simulation and transferred to real robots with minor hardware modifications and basic regularization of behavior during training. The resulting policy exhibited safe and effective movements while still being dynamic and agile, outperforming a scripted baseline in terms of speed and efficiency.
Paper: https://arxiv.org/abs/2304.13653
Website: https://sites.google.com/view/op3-soccer
>>22224 Neat! Thanks RobowaifuChat-Anon, and welcome!
>>22224 Great, I hope we can do that when we need it. Mujoco: https://mujoco.org/
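Since MuJoCo came up: its open-source Python bindings are a low-barrier way to start poking at simulated legs before any RL training enters the picture. Below is a minimal sketch assuming the `mujoco` pip package; the one-hinge "leg" model is a toy I'm making up inline, nothing like the full humanoid models used in the soccer paper.

import math
import mujoco  # pip install mujoco (the official bindings)

# A deliberately tiny toy model: a single torque-driven hinge "leg" swinging under gravity.
XML = """
<mujoco>
  <option timestep="0.002" gravity="0 0 -9.81"/>
  <worldbody>
    <body name="leg" pos="0 0 1">
      <joint name="hip" type="hinge" axis="0 1 0"/>
      <geom type="capsule" fromto="0 0 0 0 0 -0.9" size="0.04" mass="3"/>
    </body>
  </worldbody>
  <actuator>
    <motor joint="hip" gear="20"/>
  </actuator>
</mujoco>
"""

model = mujoco.MjModel.from_xml_string(XML)
data = mujoco.MjData(model)

# Drive the hip with a crude open-loop 1 Hz torque command for one simulated second.
while data.time < 1.0:
    data.ctrl[0] = 0.5 * math.sin(2.0 * math.pi * data.time)
    mujoco.mj_step(model, data)

print(f"t={data.time:.3f} s, hip angle = {data.qpos[0]:.3f} rad")

The usual path in papers like the one above is to wrap a model like this in an RL environment, learn a policy entirely in simulation, and only then worry about transferring it to hardware.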
Open file (94.81 KB 662x883 FreaG003lc.jpg)
I'm going to repost this from the short stack thread: http://davidbuckley.net/DB/index.htm This guy made a bunch of simple bipedal walker robots that might be useful for early robowaifus.
>>22542 Very cool, thank you Anon. Kinda reminds me of some of the droids from St*rwars.
>>22224 stop bullying him!!! D:
>>22688 Excellent find Anon! Looks very promising. They've managed to capture some fairly realistic human-like motions for a prototype. I'll be interested to see more technical information from this research group in the future. Thanks Noidodev. Cheers. :^)
Open file (539.05 KB 768x432 Strandbeest.jpg)
I know there is a mechanical thread, but what if the solution to an affordable robowaifu is in the Jansen walking mechanism?
https://www.youtube.com/watch?v=C7FMIRfP1tk
Dynamically changing the length of the links and finding the correct set of nodes could lead to something functional and aesthetic.
>I know there is a mechanical thread We even have one for monster waifus: >>10259 - good luck...
>>22728 I'm going to merge this thread soon into our Bipedal Locomotion thread, OP. Jansen mechanisms (and similar ones) have been discussed here on /robowaifu/ before. Allow me to agree with Noidodev, and wish you good luck Anon! Cheers. :^)
James Bruton showing the basics of bipedal robots and the challenges in another example: https://youtu.be/WJKhpGFg4uU - not really close to a human but maybe educational.
Personally I'd like to break this down from most doable to hardest. Bipedal motion is probably the hardest, and we should focus on AI last, just saying.
>>22786 Good idea Anon. Mind breaking down our big problems into all the 'little problems' you can think of, then posting them here as a list, ranked "Easiest to Hardest" ? TIA.
>>22786 Totally agree, but still I keep an eye on it. If I see videos about how to archive it and learn about it, I post them. Bipedal walking itself can be broken down into smaller pieces, and some simple walking with guidance from a human would already make movement from place to place easier.
>>22787 Here's my guess, from easiest to hardest:
robot skin
robot hands
robot head
robot balance
robot mouth
robot arms
robot legs
walking
But that's just off the top of my head, based on my guesstimates just now...
>>22808 Thanks Anon! If you ever take the time to break this list down methodically, I suggest posting it to our How to solve it thread.
>>22814 I just asked bard to arrange it:
Here is the list of the items from easiest to hardest to do:
Robot skin
Robot hands
Robot arms
Robot legs
Robot head
Robot balance
Robot mouth
Walking
Robot skin is the easiest to do because it is simply a covering for the robot's body. Robot hands are also relatively easy to do, as they can be made from a variety of materials and can be designed to perform a variety of tasks. Robot arms are also relatively easy to do, as they can be made from a variety of materials and can be designed to reach and grasp objects. Robot legs are also relatively easy to do, as they can be made from a variety of materials and can be designed to walk and run. Robot head is more difficult to do than the other items on the list, as it needs to be able to perform a variety of tasks, such as seeing, hearing, and speaking. Robot balance is also more difficult to do than the other items on the list, as it needs to be able to keep the robot upright and stable. Robot mouth is also more difficult to do than the other items on the list, as it needs to be able to perform a variety of tasks, such as eating, drinking, and speaking. Walking is the most difficult item on the list, as it requires the robot to be able to coordinate its movements in order to move forward.
It is important to note that this is just a general list, and the difficulty of creating a robot will vary depending on the specific design and capabilities of the robot.
The almighty AI has spoken.
>>22819 lol I just reread that eating and drinking wtf
>>22820 okay, got a completely different answer from chatgpt:
Arranging these tasks from easiest to hardest:
1. Robot skin: Adding a protective or decorative covering to the robot's body is relatively straightforward and doesn't involve complex mechanics or functionality.
2. Robot mouth: Constructing a robot mouth to mimic basic movements, such as opening and closing, can be achieved through simple mechanisms like servos or pneumatic actuators.
3. Robot hands: Designing and building robot hands that can perform basic grasping and manipulation tasks requires more complexity and precision, but it's a well-explored area in robotics.
4. Robot arms: Developing robot arms involves creating articulated structures with joints and implementing motorized actuators for movement. It requires a higher level of mechanical design and control.
5. Robot legs: Designing and implementing robot legs for walking or locomotion involves complex mechanisms for stability, balance, and coordinated movement. It requires advanced control algorithms and sensor integration.
6. Robot head: Building a robot head involves integrating various components like cameras, sensors, actuators for facial expressions, and potentially speech synthesis capabilities. It requires mechanical design, electronics, and programming skills.
7. Robot balance: Achieving stable and dynamic balance for a robot is a significant challenge. It involves complex control algorithms, sensor fusion, and precise actuator control to maintain equilibrium and adjust to external forces.
8. Walking: Developing a walking robot is one of the most challenging tasks. It requires a combination of mechanical design, kinematics, motion planning, and control algorithms. Achieving stable and efficient locomotion across different terrains demands advanced techniques like dynamic balance and gait optimization.
Please note that the difficulty of each task can vary based on the specific requirements and the level of complexity desired. This ranking provides a general idea of the relative complexity involved.
chatgpt>>>>>>>>>>>>>>>>>bard
>>22821 One last thing: we really should give the anon who made the robot head some props, considering it's so high up the list. Hope he didn't leave.
>>22808 Thanks, but we have a whole thread for making overviews, planning, and diagrams, and a subthread there does something quite similar to what you're doing here: >>9554 - breaking the development down into smaller steps. Also, please consider that the LLMs are likely not better than this board in knowing things in regards to robowaifus, since sadly we're somewhat the experts on this. On top of that, they're libtard censorship challenged, which means they might not know a robowaifu is supposed to be something like a woman, what a good-looking woman is supposed to look like, and so on.
here's something I've thought up. It'd be rough and rigid but it'd get the job done I think.
>>23177 Please work a bit more on your ideas before you post them, and add a description.
>>23180 omg dude chill. What I just posted is fine. Clearly you never played with mechanix as a kid. It's not ideal, but ideal will never see the light of day.
>>23180 Do you WANT this board to die? I get the drawing is completely childish, but your post is completely snobbish. Encourage the newcomers ffs.
>>23229 Adults often times draw on napkins to express their ideas too ehem
Related linkage mechanism, not only for walking >>24791
> HRC Model 3 Bipedal Robot IK Test by Letic Z https://youtu.be/i6j0hg5ZIPE
>>26403 Neat! Thanks NoidoDev.
Open file (78.13 KB 922x1396 Femisapien-a21.JPG)
Attached is an inside view of the "Femisapien" toy from Wowee in the mid 2000s. The leg mechanism uses only 3 motors, with the ankle synced to the hip via bowden cable and hip linked to knee via parallelogram. 1 motor to move hips side/side, and one motor per leg to move them forward/back. Demo Vid: https://www.youtube.com/watch?v=rNezxoSEEB0
>>26432 Mark Tilden is a genius tbh. (>>10257) There's a concept called Neuromorphics. Its basic tenets have in some cases been solved to a large degree simply by clever designs using standard analog electronics. We could do with a bit more of that style of insight here on /robowaifu/ IMO. Cheers. :^)
Open file (3.06 MB 1844x2792 0Femisapien.png)
>>26432 Femisapien is even simpler/stupider. That's just a floating steel rod, not a bowden tube. There's nothing that moves her hips. She is underactuated, relying on her body's dynamics to move her hips. Those springs help reduce the energy needed to shift her upper body's mass over to one leg. This is achieved by throwing mass (her arms) to alter her upper body's center of gravity, which causes her to tip with the help of those springs. It's a great demonstration of Keep It Simple, Stupid: reducing costs to an absolute minimum by using dynamics to simulate the functions of servos that aren't there. Her arms being able to turn at the elbows, move up and down, and swing in and out, all with one motor and potentiometer each, is also clever. The distribution of mass to even out the torque needed to move her legs across their arc, and the attention paid to her periods as a system of pendulums, are worthy of note. My all-time favorite biped. There's so much to love about her design.
>>26502 >She is underactuated Clever design work. After all, the simplest, most robust parts are the ones that aren't there! :^) Same is even moreso when it comes to software. As mentioned here (>>26447), if we can use simple analog electronics parts to fill in for what otherwise would need complex software to achieve, the parts attributes above can be expanded to include faster as well. Thanks for the nice breakdown analysis Kiwi. I wish the two of you had physical copies of Femisapien so you could really do proper teardowns/reverses. Cheers. :^)
Human Moves, Robotic Grooves: How AI Mimics Human Motion in Stunning Detail https://youtu.be/3LkWydJq6y0
>>26503 I do have a Femisapien. Had to pop it open to fix the arms. No, I'm not going to open 'er up again unless necessary. If I did, the husbando Joebot would get lonely. Femisapien is the peak of the Wowee toy line when it comes to leg movement - of course the RS Media takes the cake for customization (which I also have). Everything after - like her husbando Joebot - is downhill. (I think I got the order right, idk)
I was looking at G-code for 3D printers and found this page, "alternatives to G-code":
https://reprap.org/wiki/Firmware/Alternative#alternatives_to_G-code
In it they describe something I was groping towards here and in subsequent comments. >>21602
They say:
"... Much of the "Firmware - experimental" RepRap discussion forum discusses protocols that are hopefully easier to interpret than G-code. splines and circles: By describing every move in the form of curved cubic splines, not only do we reduce the volume of data required to describe a curved part, but it may also make the microcontroller firmware simpler and faster. Faster: only integer arithmetic required. Simpler: the firmware only needs a single subroutine to handle cubic splines, rather than several subroutines to handle straight lines, circles, arcs, and the involute curve used twice for each gear-tooth..."
Curved cubic splines! And about them:
"...Less volume: 8 cubic splines per circle are more than adequate for high precision CAD/CAM machines[5], while apparently (?) the current RepRapGCodes requires hundreds of tiny straight line segments to approximate the same circle. Elegant multispline motion controller and RepOlaRap and Repic5D mention such splines. Elegant multispline motion controller "will not use G-code. It will use a custom language based on cubic Bezier curves. This allows for much better description of arcs and will result in much higher quality prints with a much lower data throughput requirements."..."
Some others: any of you ever heard of Don Lancaster? He did articles on electronics and PostScript. I think he came up with the "flutterwumpers":
"... Flutterwumper Library generally focuses on generating shapes in PostScript format, and then converting directly from PostScript to "step, direction" pulses, without ever going through G-code. ".gcode versus .s3g" discusses the ".s3g" format file produced by replicatorG. It also mentions that USB seems to have high latency..." etc.
This is some good, basic, foundational stuff. At some point all of this will have to be used in some form. That high-precision CAD/CAM machines use 8 cubic splines per circle tells you it's likely close to optimal. They have thought deeply about this, and all these machines started life severely compute-constrained compared to even the low-power microcontrollers of today. So at the very least it should be close to adequate for anything we could come up with. The tool heads of these machines use the same sort of 3D motion I talked about for programming movement for limbs and body parts.
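To make the spline idea concrete for limbs rather than print heads: a swing-phase foot path could be sent as just the four control points of a cubic Bezier curve, and the leg's microcontroller can then sample it as finely as it likes with a handful of multiplies. A minimal sketch; the control points and the metric 3D representation are purely illustrative, not any existing firmware protocol.

def cubic_bezier(p0, p1, p2, p3, t):
    """Evaluate a cubic Bezier curve at parameter t in [0, 1] for 3D points."""
    u = 1.0 - t
    b0, b1, b2, b3 = u*u*u, 3*u*u*t, 3*u*t*t, t*t*t
    return tuple(b0*a + b1*b + b2*c + b3*d for a, b, c, d in zip(p0, p1, p2, p3))

# Hypothetical swing-phase foot path (meters): lift off, arc ~0.4 m forward, land.
P0 = (0.00, 0.0, 0.00)   # toe-off position
P1 = (0.10, 0.0, 0.12)   # pull up and forward
P2 = (0.30, 0.0, 0.12)   # carry forward at height
P3 = (0.40, 0.0, 0.00)   # landing position

# The controller only needs the 4 control points; it samples the curve locally.
for i in range(5):
    t = i / 4.0
    x, y, z = cubic_bezier(P0, P1, P2, P3, t)
    print(f"t={t:.2f}  foot at x={x:.3f} y={y:.3f} z={z:.3f}")

Four control points at, say, 2 bytes per coordinate is 24 bytes per foot per step - right in the ballpark of the 22-byte step command sketched earlier in the thread, which is a nice sanity check that spline-style commands fit a very small bandwidth budget.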
Search down this page for a lot of references to "robotics": https://www.tinaja.com/flut01.shtml#hexapods That's Don Lancaster's site. He has a ton of stuff on there related to robotics and the control thereof.
Open file (101.03 KB 750x483 ezgif-3-84f76604c8.jpg)
I'd be willing to stick wheels under its feet. That's not a problem for me. What I'm not willing to compromise on is its ability to squat. I also do not want to spend thousands of dollars on actuators. I assume what matters in this case is the knee movement rather than the hip. Thoughts?
>>27887 So this is very crude, but like they say: perfection is the enemy of progress...
>>27892 Did a 3D print to illustrate the point. Should have made it bigger... I'm going to have to make it bigger and do the whole thing at a small scale, I think.
>>27903 Good ratchet design, Anon. Do you think you'll go ahead and finish it up with a proper housing too? Good luck with your project! Cheers. :^)
I changed my mind, I'm thinking this one. They can be spread apart on the sphere. The MS Paint drawing didn't get the point across, so here you go.
>>28255 I think I'd make the gears the same size. I don't know why I made the main one larger, really.
>>28256 Okay, so that leaves the problem of one leg being up at all times. However, I'm going to keep the solution a secret, maybe. Unless you guys can find it first.
>>28258 Okay, no, I've got it. Add that to this. The sphere is a placeholder for how to make the legs stretch out.
Open file (13.59 MB 1280x720 wPvuTsirGEnHmXsInBr9j.mp4)
Open file (51.95 KB 612x442 agile but safe.png)
This paper focuses on a quadrupedal robot but the findings are also applicable to bipedal robots. Just imagine what will be possible two papers down the line.
Agile But Safe: Learning Collision-Free High-Speed Legged Locomotion
>Legged robots navigating cluttered environments must be jointly agile for efficient task execution and safe to avoid collisions with obstacles or humans. Existing studies either develop conservative controllers (< 1.0 m/s) to ensure safety, or focus on agility without considering potentially fatal collisions.
>This paper introduces Agile But Safe (ABS), a learning-based control framework that enables agile and collision-free locomotion for quadrupedal robots. ABS involves an agile policy to execute agile motor skills amidst obstacles and a recovery policy to prevent failures, collaboratively achieving high-speed and collision-free navigation.
>The policy switch in ABS is governed by a learned control-theoretic reach-avoid value network, which also guides the recovery policy as an objective function, thereby safeguarding the robot in a closed loop. The training process involves the learning of the agile policy, the reach-avoid value network, the recovery policy, and an exteroception representation network, all in simulation. These trained modules can be directly deployed in the real world with onboard sensing and computation, leading to high-speed and collision-free navigation in confined indoor and outdoor spaces with both static and dynamic obstacles.
Project page: https://agile-but-safe.github.io/
Paper: https://arxiv.org/abs/2401.17583
It's not 100% collision-free, but their system can work on icy snow, bear a 12 kg payload (equal to its own weight) and withstand perturbations. It can reach a peak speed of 3.1 m/s and an average speed of 1.5 m/s in cluttered environments while switching back and forth between the agile and recovery policies.
Some key takeaways:
>For agility in collision avoidance, the best practice is to integrate locomotion and navigation rather than decoupling them as separate subtasks.
>A low-dimensional exteroception representation can facilitate policy learning and generalization.
>Their system can adjust reward weights to trade off agility and safety.
And some other interesting bipedal locomotion papers cited in the paper:
>Learning Vision-Based Bipedal Locomotion for Challenging Terrain
https://arxiv.org/abs/2309.14594
Which inspired them to use a similar low-dimensional exteroception representation.
>Robust and Versatile Bipedal Jumping Control through Reinforcement Learning
https://arxiv.org/abs/2302.09450
Which was mentioned as one of many examples of robust locomotion using an RL-based controller.
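The "policy switch governed by a learned reach-avoid value network" boils down to a simple runtime rule: keep running the agile policy while the learned value says the robot can still reach its goal without collision, and hand control to the recovery policy the moment it can't. Here's a minimal sketch of that switching loop under my own assumptions; the threshold, function names and observation layout are placeholders, not the authors' code (their implementation is linked from the project page).

from typing import Callable, Sequence

# In the paper these are trained neural networks; plain callables here so only the
# switching logic is shown.
Policy = Callable[[Sequence[float]], Sequence[float]]   # observation -> joint commands
ValueFn = Callable[[Sequence[float]], float]            # observation -> reach-avoid value

def control_step(obs, agile: Policy, recovery: Policy, reach_avoid: ValueFn,
                 threshold: float = 0.0):
    """One control tick of an ABS-style dual-policy switch (a sketch, not the authors' code).

    Convention assumed here: reach_avoid(obs) <= threshold means "goal reachable without
    collision", so the agile policy stays in charge; otherwise the recovery policy takes
    over and steers the robot back toward a safe state.
    """
    if reach_avoid(obs) <= threshold:
        return agile(obs), "agile"
    return recovery(obs), "recovery"

# Toy usage with stand-in functions; pretend obs[0] is distance to the nearest obstacle.
dummy_agile = lambda obs: [0.3] * 12
dummy_recovery = lambda obs: [0.0] * 12
dummy_value = lambda obs: -0.2 if obs[0] > 0.5 else 0.4

cmd, mode = control_step([0.9], dummy_agile, dummy_recovery, dummy_value)
print(mode)   # "agile" while clear of obstacles; flips to "recovery" once obs[0] <= 0.5

For a robowaifu the same pattern is attractive: one policy tuned for a smooth, lifelike gait and a second, boring-but-safe one that only exists to catch her when the first is about to get into trouble.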
>>29059 Remarkable. Two things I'd note about the video and the graph:
* 1a. Clearly, ABS achieves higher performance metrics.
* 1b. The tighter 'spread' of the data points on the graph possibly (probably) indicates a sounder, more-efficient solution to the problemspace itself, compared to the competition.
* 2. The (common) design of the actuated legs is an exemplar of one of my key design opuses (>>28664), namely:
> "The designer must keep the very dense elements (such as the biggest of the actuation motors & batteries) inboard within the torso framework."
These doggobots do exactly that, and it's a fundamental reason this general design approach is so popular now: because it works, and it's energy-efficient.
---
>exteroception
Lol, that's a new one on me, even though I've had the exact idea in my head for probably two decades now. :^)
Top-quality post Lad. Cheers. :^)
POTD
Open file (1.06 MB 2218x3326 7997171500495697063.jpg)
>>29766 Thread related, an idea that didn't work very well.
>>9053
>Unfixed center of mass in design: as in "megaman" robots (and the Roll example), center of mass is drawn to the Boots, which themselves could be weighted further. (for "intimate" times the boots could be taken off to reveal more human-like feet, when sitting comfortably or lying down, etc).
Somehow I came up with this exact same idea yesterday or so, so I might as well voice my support for it here. Seems like it would make it easier for the robot to balance if the center of gravity were closer to the feet, even if it makes the bot 20+ lbs heavier.
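A quick worked number on how much weighted boots actually move the center of mass, since the trade-off is added leg mass versus stability. The masses and heights below are assumptions I picked purely for illustration, not measurements of any design.

def com_height(parts):
    """Overall center-of-mass height from (mass_kg, height_m) pairs of body segments."""
    total_mass = sum(m for m, _ in parts)
    return sum(m * h for m, h in parts) / total_mass

# Assumed 30 kg robowaifu: body CoM ~0.9 m up, light feet near the floor.
light_feet = [(28.0, 0.90), (2.0, 0.05)]
# Add ~20 lb (9 kg) of boot ballast at ankle height.
weighted = [(28.0, 0.90), (2.0, 0.05), (9.0, 0.05)]

print(f"light feet    : CoM at {com_height(light_feet):.2f} m")
print(f"weighted boots: CoM at {com_height(weighted):.2f} m")

Under those made-up numbers the CoM drops from roughly 0.84 m to 0.66 m, which does improve the static tipping margin - but every step now has to swing an extra ~9 kg at the end of the leg, which is exactly the weight-in-the-extremities cost other anons have warned about ITT.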
Don't really have robotics or simulation experience, but here goes.
cheat mode
Black boots with one rubber wheel per foot, to move in a manner that gives the illusion of walking with the feet very close to the ground. This can be done with the wheel mid-foot to mimic a heel strike, or at the forefoot, using the heels as stable ground and doing a forefoot walk with the front wheels dragging along and the heel raised. There are a few options more on the monster-girl side of design, such as a dragging dragon tail for extra backward support (with or without a wheel in it), or a monkey-tail tripod leg that walks as a third, smaller supporting limb. There's a disabled approach too, such as not walking at all and using a wheelchair or vehicle, or walking with a cane, walker or crutches. Or embrace tradition, reject bipedal, and crawl on hands and knees going "nyan nyan nyan" or "wan wan wan wan" :3
bipedal but not quite human mimetic
Gyroscopic balancing is an obvious option, but it will also add weight and likely can't be entirely relied on; it can at least help in some situations if there's room for it. Similar to OP's mention of a deer girl, what about a harpy with backward-bending bird legs? A bit creepy, but kinda cute maybe? "Bladeless" fans using air propulsion to aid in balancing - by bladeless I mean hiding the blades for safety reasons. This has the obvious downside of increased noise and likely isn't very energy-efficient.
human bipedal
Four-legged animals like cats, bears, dogs and raccoons can walk on two legs briefly, and humans can hop on one briefly, so any biped needs to be able to handle that; maybe look at monopods and just double the legs. The problem with this design is it will be a very hoppy walk, which might look cute until your foot gets crushed. But even if not monopod-based, keep in mind that maintaining momentary balance on a single leg will help with walking smoothly.
I watched a few vids of bipedal robots closely and I see a common flaw, thanks to my exposure to the barefoot-footwear crowd. Bipedal robots often have stiff boards for feet - mimicking shoes, not the feet humans evolved with - and either walk with a heel strike or a midfoot strike in short steps. Heel strikes are often a hard impact on the ground, and while midfoot strikes are technically possible, the increased landing surface on smooth hard ground will be noisy too. Although most people don't walk this way and all primates walk with a heel strike, I say: why not try a forefoot-strike walk? Or at least use a forefoot strike at jogging or running speeds. It's the second most popular method among runners, and in many cultures it's the main running style, barefoot or in thin sandals and the like. What this does is provide similar movement to backward-bending knees, just with different joint proportions - and the fastest bipedal bots have that ostrich design. It also allows sensors in the feet to adjust step length, say for approaching stairs, and to feel the ground for stability before stepping forward. This walking method is already required for walking down stairs anyway, and logically it will improve downhill walking. It may increase the tripping or slipping risk when walking uphill though, so maybe adjust the walking style for that situation. The soles could be fitted with thin, flexible, grippy rubber like that used in bouldering shoes, with small treads like boating shoes to avoid slipping on wet floors. In theory this should make for a daintier walk and less noise on impact, because the ankle joint will act like a shock absorber.
The design of the foot should allow movement more like the HyperLeg by IRIM Lab at Koreatech, not a flat plank. https://www.youtube.com/watch?v=wLFCMwRvhVI
>>30137 I probably should have written this in reverse order, but I wanted to order the methods from easiest to hardest. Looks like I didn't realize the HyperLeg was already posted in here, so I didn't need to link to YouTube when referencing it.
Crosslink >>31550 >Small robot project http://zlethic.com/hrc-model-0/
Interesting project. Discusses the general topic of balance. In brief. https://www.youtube.com/watch?v=z2qh1hENfq8 > ( via : >>33233 , thanks Anon! )
>>27069 Have to post in this thread, because I discovered a video of a Japanese bipedal robot that has toes :D At the tail-end of the vid, of course: https://www.youtube.com/watch?v=QHjd-O8S0nI
Potential method for developing legs https://www.youtube.com/watch?v=i1xM61WBOBQ
>>33868 Thanks, Mechnomancer! Hope you're having a relaxing break rn.
>toes
I think many robowaifu designers may overlook how important the ankle/heel/ball/toe flexion-complex is to good ambulation for bipeds. Essential. Cheers, Anon. :^)
>>33876 Antie FTW! I loved that part of the story too. :^)
>line-loaded, camshaft-driven
That's actually a really interesting concept, Anon. Thanks for pointing it out. I especially like the fact the guy realizes how important it is to keep the 'outboard' weight down, and focus the mass inside the bot's torso. Cheers, Anon! :^)
>>33898
>relaxing break rn
Not really, burned out a power converter in the non-waifu/mek project and am faffing around with chonky ATV tank treads.
>many robowaifu designers may overlook how important the ankle/heel/ball/toe flexion-complex is to good ambulation for bipeds
The flexing of the toes helps form the equivalent of the Femisapien's leg parallelogram that allows the robot to push forward without being flatfooted.
>>33902
>Not really, burned out a power converter in the non-waifu/mek project
Oh no! :/
>and am faffing around with chonky ATV tank treads.
Sounds pretty cool tbh.
>The flexing of the toes helps form the equivalent of the Femisapien's leg parallelogram that allows the robot to push forward without being flatfooted.
That's a cool way to think about it, Anon. I'd love to eventually offer a Steampunk Sadie robowaifu model/kit that looked full-on authentic OG 1800s retro. :D
Open file (6.13 MB 320x212 treadoopslolz.gif)
>>33915
>Sounds pretty cool tbh.
Yes and no. Yes, in that I now have what looks like a miniature NASA shuttle mover. No, in that I need to slap a buncha 3D-printed parts on 'em to make the treads stay on their guide wheels (the yellow and orange bits). Without my mods they probably would've been crappy to have on an ATV. I also need proper wiring techniques, otherwise the splices heat up after a minute or so of driving lol. But figuring out how to digitally control a 2000 lb motor is nice (never mind controlling 4 of them), because I can probably look into reciprocating walking mechanisms for lighter robots, like an owo version of zaphod daederik's steam-powered man, or a spider-like platform for SPUD to ride around on.
>>33962 Dang, that looks too cool, Mechnomancer. Be careful on that thing, lol.
>muh le 2000lb motor
Lolwut?
>or a spider-like platform for SPUD to ride around on.
I'm going with the pony motif for dear robowaifu Sumomo-chan rn. It'll be her 'hoers', and she'll be its rider. But even the pone will be able to climb up with her onto a wheeled zipper platform for scootin' around fast (for example, going with Master to the park for picnics).
>>33969 >muh le 2000lb motor >Lolwut? A 2000lb winch motor, each motor pulls 15A for a total of 180 watts.
>>33972 Oh, OK got it. Heh, I thought you were saying the motor weighed a ton. :^)
Thanks for all the YouTube links, but please add some description when posting. Ideally more than one line. These videos have a text description on YouTube which can be copied. With ">" you can quote it, though make sure to put this in front of each paragraph when doing so. Cheers.
