/robowaifu/ - DIY Robot Wives

Advancing robotics to a point where anime catgrill meidos in tiny miniskirts are a reality.

Robotics sites, organizations and projects Robowaifu Technician 09/16/2019 (Mon) 04:21:24 No.268
There are a lot of robotics research and projects going on out there in the world, many of them with direct application for us here. Please contribute important, quality links you find that you think would help /robowaifu/.

Robot Operating System (ROS)
www.ros.org/
>[ROS] is a flexible framework for writing robot software. It is a collection of tools, libraries, and conventions that aim to simplify the task of creating complex and robust robot behavior across a wide variety of robotic platforms.
Gazebo Robotics Simulator
gazebosim.org/
>Robot simulation is an essential tool in every roboticist's toolbox. A well-designed simulator makes it possible to rapidly test algorithms, design robots, perform regression testing, and train AI systems using realistic scenarios.
Robohub
robohub.org/
>Robotics community and news aggregator.
Robotics Stack Exchange
robotics.stackexchange.com/
>StackExchange's question-and-answer channel for robotics.
Robot Zero One
robotzero.one/
>Robotics & related electronics news and reviews.
Open file (150.35 KB 1499x694 0704234812578_13_ieee.jpg)
IEEE has a nice resource page for robots:

robots.ieee.org/robots/

It's written for normalfaggots, but it's interesting that there's someone out there looking at all the notable robots in the world and compiling a list.

There really need to be more waifus in those pics.
I feel that this product is essentially a gamechanger; Japanese twitter exploded the other day. This robot mimics what robotics competition kits use, only in $500 kit form instead of thousands of dollars in machined metal parts. It puts a conventional hobby RC car to shame.

www.dji.com/robomaster-s1

What this product essentially does is turn the first-person shooter into a live genre. I've been wondering why a lot of home robot companies fail – such as Anki – and one reason is that the consumer gets bored easily. Imagine a videogame where the location is just one room – your room – and you can only interact with or control one character. You'd play back all of the animation sequences, and that's it. It's more like the Extras menu when you finish an RPG and unlock all the viewable models rather than a full experience. Or it's like a basic demo level in a sex game that Illusion makes. But that's what most of the home robots with apps do, so they are very limited.

To enjoy a robot for a longer time, we need it to offer something like a full game experience… in this case, it can be used as a first-person shooter.

So for a waifubot to survive, it needs a good market… as a toy in a real-world videogame-like application. It has to be Dual Use. China is a master in dual-usage… consumer products have military applications as well. In our case, we can make toys that can turn into sex robots. But we have to sell them as harmless toys to get that market.

If an FPV land drone tank is the equivalent of the FPS, then what is a legged Waifubot the equivalent of? How about a fighting game? Imagine a $500-$1000 waifubot kit that turns your house into an open world RPG with augmented reality enemies. Will that be the killer app not just for waifubots, but home robots in general? Some things to ponder.
>>546
That would certainly make a nice base for a mobile moebot anon, thanks.
>Personal Robotics Lab
personalrobotics.cs.washington.edu/

>Humanoid Robotics Laboratory at the Georgia Institute of Technology (defunct?)
www.golems.org/oldindex.html

>>268
Here's the forum and the wiki for the ROS project OP
discourse.ros.org/
wiki.ros.org/
>>546
>www.dji.com
Turns out, they have an extensive stable of products /robowaifu/ might be interested in. I would definitely like to purchase one of these S1's and explore what we can learn from it here.
https://www.youtube.com/watch?v=hoTK9CZBnaE
https://www.dji.com/se/robomaster-s1/video-courses
They have an easily-mastered, Scratch-like programming interface to this system as well.
https://www.dji.com/se/robomaster-s1/programming-guide
A Poppy robot that learns to explore the environment and play: https://www.youtube.com/watch?v=NOLAwD4ZTW0
It's an old project but still quite interesting.
>Intrinsically motivated spontaneous exploration is a key enabler of autonomous lifelong learning in human children. It enables the discovery and acquisition of large repertoires of skills through self-generation, self-selection, self-ordering and self-experimentation of learning goals. We present an algorithmic approach called Intrinsically Motivated Goal Exploration Processes (IMGEP) to enable similar properties of autonomous learning in machines.
https://arxiv.org/abs/1708.02190
From the current context it picks a novel object, then picks a goal (a new outcome) for that object and executes an action to achieve it. It tries to predict the object's trajectory from this action and then updates its predictions. It also uses intrinsic rewards based on the empirical improvement in solving the goals it generates and on the novelty of unexpected outcomes.
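The goal-exploration loop from the abstract can be sketched in a few lines. This is only a toy illustration of intrinsically motivated goal exploration on a made-up one-dimensional system, not the paper's actual IMGEP implementation; the environment, noise level, and goal space here are all invented for the example:

```python
import random

random.seed(0)

def environment(action):
    # Unknown action -> outcome mapping; the learner only ever sees samples of it.
    return action ** 3

archive = []  # (action, outcome) pairs discovered so far

def explore(n_episodes=300, noise=0.1):
    for _ in range(n_episodes):
        goal = random.uniform(-1.0, 1.0)  # self-generated goal in outcome space
        if archive:
            # Crude inverse model: perturb the action whose outcome came closest.
            action, _ = min(archive, key=lambda ao: abs(ao[1] - goal))
            action = max(-1.0, min(1.0, action + random.gauss(0.0, noise)))
        else:
            action = random.uniform(-1.0, 1.0)  # bootstrap with motor babbling
        archive.append((action, environment(action)))

def competence(goal):
    # Distance from a goal to the closest outcome reached so far.
    return min(abs(outcome - goal) for _, outcome in archive)

explore()
print(max(competence(g / 10.0) for g in range(-10, 11)))
```

The point of the trick is that sampling goals in *outcome* space forces the learner to cover regions that random motor babbling rarely reaches; the real paper adds learned forward models and the novelty-based intrinsic rewards on top of this skeleton.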
>>9613 That's neat to see teams begin to do mashups with readily available systems. Definitely right up our alley here.
This seems to be an interesting community for roboticists: https://synthiam.com/Community - with projects and their makers here: https://synthiam.com/Community/Robots?sort=2 - Just found it, so I can't say how good it really is.
Open file (63.27 KB 1200x675 00-top.jpg)
Open file (333.57 KB 642x447 premaid ai.png)
A Japanese company created a decent companion robot 5 years ago called Palmi. It remembers what you chat about, comments on unusual things and what you're doing, remembers peoples voices and faces, can have group conversations, and can walk around a bit. It cost $3000 though and didn't get much attention beyond documentaries and exhibitions before being discontinued. https://www.youtube.com/watch?v=xaesUaCTBlk https://www.youtube.com/watch?v=x3TIKwueRSU https://robots.dmm.com/robot/palmi They also made a dancing robot, Premaid AI, for $1500 and a bunch of others but all their robots are discontinued now. https://www.youtube.com/watch?v=avwJElBz4Cg https://www.youtube.com/watch?v=cGxvshwAqvE https://robots.dmm.com/robot/premaidai
>>10218 > but all their robots are discontinued now. Too bad, but thanks very much for bringing them up for us here Anon. Any idea what the scientists, designers, and engineers who worked on these projects are doing these days? Also (silly question since they were obviously commercial endeavors) any chance any of these system software or designs are accessible today?
Open file (121.17 KB 1024x768 pupper_cropped.jpeg)
Here's an overview by James Bruton on robot projects: https://youtu.be/XpkVhmbLVTo including humanoids.
Open file (155.95 KB 1280x720 making_kibo-chans_hair.jpg)
Someone posted a Kibo-chan pic on another board.
>>543 Better link and description: Robotics Stack Exchange is a question and answer site for professional robotic engineers, hobbyists, researchers and students. It only takes a minute to sign up: https://robotics.stackexchange.com/
>>19427 You don't like Reddit? I don't care: r/robotics
>Getting started
For under-12s, teenagers, and hobbyists: https://www.reddit.com/r/robotics/wiki/get_started
>Career advice
Studying robotics at university, postgraduate study, or changing careers to robotics: https://www.reddit.com/r/robotics/wiki/career
(To do: advice for postgrads, career changers, and professional roboticists; the content of the undergraduate robotics course section needs major work.)
>FAQ
https://www.reddit.com/r/robotics/wiki/faq
(To do: all of the FAQ has been unchanged for years and needs an overhaul. Killer robots are becoming increasingly prevalent, though a Terminator scenario is unlikely. "Will robots take our jobs?" is a more difficult question.)
>Resources
Online courses, books, competitions, software, parts suppliers, additional resources, YouTube channels: https://www.reddit.com/r/robotics/wiki/resources
(To do: further additions need to be made to development kits & software.)
>AMAs
https://www.reddit.com/r/robotics/wiki/ama
>Kyle Edelberg, algorithms developer for Robosimian, NASA-JPL's entry in the DARPA Robotics Challenge
>Josh Whitley, Lead Support Engineer for AutonomouStuff, LLC, who make self-driving cars
>Andrew Alter, principal engineer at Trossen Robotics
>VALOR, a DARPA Robotics Challenge team
>Robohub, a non-profit online communication platform
>Dan, of Marginally Clever Robotics
RoboFoundry https://robofoundry.medium.com/ Seems to be pretty knowledgeable about some practical basics.
InMoov is a good one, but it has a non-commercial license, and MyRobotLab is on GitHub. Forget about the batch file, just open it in NetBeans and compile it there; it's less of a hassle because the Java website wouldn't let me install JDK 11. Well, I'm compiling it right now, it remains to be seen if it'll work, but we'll see.
https://github.com/MyRobotLab/myrobotlab
and InMoov
https://github.com/MyRobotLab/InMoov
>>23133 It looks like a lot of work went into MyRobotLab... I managed to get one example running, but overall it seems kind of clunky, which is a shame. Maybe they should have gone with something other than Java... I don't know the performance requirements of that thing, but it looks like React could have done the job, really. Maybe React+Electron. But it's hard to tell because, again, I've only gotten one of the examples to run and it's kind of buggy. Funny how the guy was so dismissive on the Discord when I said I was a front-end developer.
>>23133 >>23135 Thanks for the recommendations Anon.
>>23140 Sorry for the delay, I should be asleep really haha. Anyways, I tried to run MyRobotLab, but it really does have a lot of bugs. I might try a different branch tomorrow.
>>23133 Thanks, but some of your links are botched, though then again the site is a mess anyways. You also didn't explain what it is. I guess we have to look into YouTube explainers to find out what this is actually about:
- Gestures: https://youtu.be/719fm9k1K7k
- Speech: https://youtu.be/_rBPs7lFc0U
- Connecting to other software like Google Assistant
- Arduino and servo control: https://youtu.be/AJasOtkCbro
So it seems to be something like an OS for InMoov and other such bots. I didn't watch the vids, so I don't know more. Maybe some parts are useful. It reminds me in that regard of some AIML-based chatbot platform whose name I forgot.
>>23178 It gives examples for various things, and InMoov https://inmoov.fr/ explains how to build a robot from start to finish. I can tell a lot of work went into MyRobotLab, but I don't quite like it based on my limited use, and the devs didn't make it painless to set up. You need to run a batch/bash file from the terminal, or a jar file that doesn't work. But InMoov is very interesting.
>>23155 >but I might try a different branch tomorrow. No rush Anon. You're already doing good personal research simply by investigating the platform. I'm sure you'll find a reasonable approach if you keep moving forward! Cheers. :^)
>>23201 I went to the Discord and they told me the stable branch was the develop branch, eh. I don't think I'll be using MyRobotLab. I want to use the resources in InMoov, but its license is non-commercial, and I don't know why you guys don't see the potential to make money off of this, but I do. And if all else fails, the patent could still be sold...
>>23207 No one said it would be impossible to make money off it. Though imo it makes more sense to use the skills learned by working on an open-source robowaifu to make some money. Some people make money with AI illustrations or YouTube videos about AI now. We have some threads on making money, so please don't bring it up again and again in every thread. Btw, a product is not a patent and can't necessarily be patented. Also, international patents cost $100k or more, for all I know.
>>23207 >>23217 I don't think there's anything specific to patent unless you want to engage in patent-trolling, and it may touch on pre-existing patents most certainly established by large companies to protect their 'intellectual property'. As for 'international' patents, they're near pointless considering countries like China and India don't respect them. If you have a product that sells well and they're able to mimic it, you'd better hope yours has a good hold on the market. I mean, you could try to patent the idea of a physical/virtual AI assistant and hope that it holds up when companies start to challenge your rights to the concept; you might get a pay-off or two from some companies before getting rolled over by one of the big dogs. Your best bet is to have a prototype of some kind, no matter what it is, and try to get people interested in that. By the way, this is where you guys are going to get your robowaifus from when they're actually ready for consumer-level sales, and it's funny because these guys basically ripped research from every major western developer they could, and now apparently someone else in China is doing it to them. https://dsdollrobotics.com/
>>23219 kinda unrelated but do any of these sex doll companies pivoting to robotics have an actual product? I've been hearing of an imminent release of a sex robot from these companies for years now and nothing's come of it.
>>23223 Those sex dolls can suck my dick, if they existed. I'm going to try to do this thing regardless of any homoglobo company that fails to deliver.
>>23223 The reason for this might be that there's not much reporting. I don't want to look it up right now but at least one or two Chinese companies have something animated and for all I know, Harmony is also an available product.
>>23233 harmony can... oh no wait it can't. It just talks.
>>23219 >By the way, this is where you guys are going to get your robowaifus from when they're actually ready for consumer level sales. Lol. Why would we do that when we're creating them ourselves, Anon? BTW, we've been talking about that company here on this board for 4-5 years now, so nothing really new to us here.
>Tast's Robots >Software for open source robots, legged and/or wheeled https://tasts-robots.org/ https://github.com/tasts-robots Related: >>23924
>>23233 I've been hearing about Harmony, and similar models like Solana, since at least 2018. Nothing actually came of them. They were a dead end. I remember how everyone in the manosphere, especially the MGTOW circles I was in, were ecstatic that we'd have Detroit: Become Human-tier androids (the game also came out that year) in just a matter of years.
>>23943 >especially the MGTOW circles I was in were ecstatic that we'll have Detroit Become Human, which also came out that year, tier androids in just a matter of years. While that's rather a vague time-specification, I think we all understand what you mean Anon. It just goes to show you that the complexities involved in what we're all working towards here are neither simplistic, nor easy, nor going to be readily-solved well with just a dilettante's-tier effort. Serious engineering on a grand scale is needed; this is an expeditionary trek up to a yuge mountain's peak, not a hop and a skip over to the local pharmacy! :^) Thankfully there is a growing army of associated researchers & engineers around the world, all driving towards a very diverse set of robotics & AI goals many of which will directly assist us here at /robowaifu/. And we will all get there... patience! The simple fact that all of us are even joined together here today, rationally considering this amazing goal we share in common shows just how far we've already come with computing -- and all in about half the time that cars and planes have been advancing along their own directed-evolution pathways. I predict this highly accelerated rate of improvements will only increase with time. > It has been -- and will continue to be -- an amazing ride! Just keep moving forward Anons. :^) >=== -prose edit
Edited last time by Chobitsu on 07/13/2023 (Thu) 20:31:09.
>>23943 >since at least 2018 Five years ago... >that we'll have Detroit Become Human Idk, some might have been like that, maybe. I think you're exaggerating. We'll just need some basic robowaifu, though. The unrealistic hope of some guys is that she would do all the chores at home, or at least make a sandwich. Worse cases of excessively high expectations are rare. The expectations are mostly not close to something like Westworld, or this aforementioned franchise (which I never even looked into, since it was from the start clearly framed as "owning robo-slaves is evil and they will go on a rampage"). Also, we would be much closer if more of those men had put any effort into it, especially picking some area and working on it. That said, some did, and things did progress a lot, especially in AI. Other areas also progressed; it's just about having an overview and putting the pieces together correctly, which is still hard. >>23951 Yeah, we're still moving. Slower than we should, but still going. Thanks for the encouraging speech.
>>23961 >Thanks for the encouraging speech. Y/w. We're all in this together! :^)
Open file (1.45 MB 1000x642 ClipboardImage.png)
Open file (497.73 KB 536x357 ClipboardImage.png)
https://docs.poppy-project.org/en/assembly-guides/poppy-humanoid/bom.html The .stl files are available at the link above. When you include a computer in the chassis, or use a LAN mic and speaker, for a character model running Llama 2 (it runs on a cell phone), you are basically 95% of the way to a complete chi. This robot can walk with you by holding its hand. Add cloth-like hosiery over the body form to simulate skin for now. We're basically here, bros. It's time. https://huggingface.co/blog/llama2
Open file (610.80 KB 744x654 ClipboardImage.png)
Components, servos, electronics, development and 3D printing all cost about $1,200. Many people here have spent more than that on something that doesn't actually work. https://www.youtube.com/watch?v=KmPo8nHw2FQ This robot ACTUALLY works. You can hook it into Llama 2 and custom-train the model to respond to you and have deep conversations. https://huggingface.co/spaces/ysharma/Explore_llamav2_with_TGI (try it out! You can run it locally!) Note, stuff like https://www.twitch.tv/alwaysbreaktime exists now. We're here, it's time to build the thing.
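If you do run a Llama-2-chat checkpoint locally, note that the chat variants were fine-tuned on a specific instruction format and respond much worse without it. A minimal sketch of a single-turn prompt builder; the system and user strings below are just placeholder examples:

```python
def llama2_chat_prompt(system, user):
    # Build a single-turn prompt in the Llama-2-chat instruction format:
    # <s>[INST] <<SYS>> system <</SYS>> user [/INST]
    return (
        "<s>[INST] <<SYS>>\n"
        f"{system}\n"
        "<</SYS>>\n\n"
        f"{user} [/INST]"
    )

prompt = llama2_chat_prompt(
    "You are a friendly companion robot. Keep replies short.",
    "Good morning! What should we do today?",
)
print(prompt)
```

For multi-turn chats, prior exchanges are appended as further `[INST] … [/INST] answer` segments; check the model card of whichever checkpoint you use for the exact convention.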
>>24531 Are you, yourself, planning on building this project, Anon? If so, then I'll leave it up as a thread. Otherwise it will be merged with already-existing threads. Poppy Project is indeed a good project for R&D. We've known about it here since within a month of its inception.
>>24536 What happened to my post?
>>24538 You can find it in the Chikun coop. I've warned you repeatedly about such posts here Anon. NYPA.
>>24539 you all have fun without me then
>>24535 Calm down, we are not getting left behind. I'm quickly typing this out on a phone so excuse any mistakes, but I hate to see someone get discouraged. I think OP is underestimating the amount of work left. I'll comment mainly on the software side because that's where I am focused. I feel there is a large disconnect between reality and people's expectations of what an LLM is. If you want a husk for a robowaifu then sure, stick Llama 2 in and call it a day. Large language models are not magic or a full mind. Your waifu will not have robust memory or opinions that are consistent. She will have no model of the world. There will be no internal emotional state, nor will she be aware of the passage of time and be able to passively observe things. LLMs as a mind are a red herring and will make your waifu a literal wordcel. Remember, all an LLM is doing is predicting the next token. It's amazing how much that can achieve, and I don't want to take away from the amazing work deep learning has brought us. The way I envision a waifu mind being constructed is with a symbiosis of neural networks and classical AI / computing. I'll start a discussion on this board and share some observations and thoughts on this topic later (it's off topic). I'm new here, this is actually my first post. I found this place about a week ago and have been lurking. I have been working on and off on the artificial "waifu" problem independently for a long time as a hobby, although I referred to it as crafting a "chat bot"; I started back when AIML was the thing.
>>24546 >I'll start a discussion on this board and share some observations and thoughts on this topic later (it’s off topic). Please do so Anon. You have a much better understanding of the general topic of a robowaifu mind than the bulk of the population does ATM. A language model is just one part of a vast set of interwoven tasks that all have to be solved together to have a working robowaifu. >I’m new here, this is actually my first post, I have found this place about a week ago and have been lurking. Welcome Anon! Glad you're here.
>>24546 You're not going to succeed at making chobitsu. You don't know what the goal is; you just have this idea that you can make a robot human. I know what my goal is: it's a sex bot, and I will build the robot based on that.
>>24546 It's great you're here. What do you think the odds are of making a robowaifu that could learn basic commands like a dog, right now? "Come here." "Go over there." Simple stuff. I realize a dog has a rich internal state and I don't think that's in the cards, but... basics? And this is assuming a commodity processor. Not the latest and greatest, but maybe a decent game-machine type.
>>24550 Processing power is not an issue if you assume the waifu is going to get most of it from the internet via APIs and whatnot. Sure, there's stuff like Jetson, which is more powerful than a Raspberry Pi for LLMs, but it's still not enough to make something that's good, and they can also get quite expensive. And now that I think about it, there's no reason why the waifu can't have a 4/5G connection anyways. Since my goal is a sex bot, though, I know what to keep and what to take away. Her walking on uneven terrain doesn't matter to me, her being smart doesn't matter that much, her having visual recognition would be a bonus. It's far more important that she be able to give a hand job without hurting the recipient. That's what a goal does to priorities.
>>24549 >You're not going to succeed at making chobitsu. Au contraire. None of us knows the future of course, but there are few enclaves of private individuals more likely to succeed at this grand adventure of devising real-world, opensource, quality robowaifus for the average man than /robowaifu/ is. Time alone will tell. :^) >You don't know what the goal is you just have this idea you can do a robot human. Ehh, it's a good point -- no real debate from me on this point in general. And the reason is simple: No one in the entirety of all human history has ever achieved what we're attempting here. Simple as. >I know what my goal is, it's a sex bot and I will build the robot based on that. It's a reasonable place to begin with for the shell of course (and a position we've held here on this board since our very first week of existence). But the majority of us want much, much more than just a sexbot from our robowaifus and some here don't even care about that aspect of it; we're all striving for wholesome, loving, and supportive companionship from our robowaifus. That's a very tall order I'm well-aware. But still, this is our endgame here, Pete. Cheers. :^) >=== -prose edit
Edited last time by Chobitsu on 08/10/2023 (Thu) 16:12:59.
>>24553 >if you assume the waifu is going to get most of it from the internet via APIs and whatnot. I don't think even a single honest anon here is eager to have the Globohomo surveillance state/rental property managers right there in the home with him, Anon. >tl;dr A robowaifu has to operate fully-autonomously, using nothing for her runtime needs but what anon has with him there inside his home. Never connect your robowaifu directly to the Internet! >=== -prose edit
Edited last time by Chobitsu on 08/10/2023 (Thu) 16:20:11.
>>24550 Technologies you would be interested in can be found by looking into SLAM robot navigation; current robot vacuums have OK indoor navigation and positioning. They tend to use LiDAR and a camera pointing at the ceiling. They create a 2D room-shape map and use ceiling landmarks to figure out their location. My robot vacuum has half a gig of RAM on its SOC. I have not investigated this in depth, but I hope that is useful information. For speech recognition of simple voice commands, that has been possible for a long time on low-power hardware. Look up PocketSphinx. What you're asking for is possible on low-end hardware; look at the pre-ERS-1000 Sony Aibo dogs :) >>24549 My goal is not human-level AGI, not even close. But you're correct, I am looking for more than a sex bot; I'm more interested in the AI side than the cooming aspect. Plus, this is a hobby. I'm not looking for fast results, and I enjoy the difficulty of the problem; working on this has helped me develop my skills.
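For anyone curious what those vacuum-built 2D room maps look like under the hood, they are usually occupancy grids: each cell is marked unknown, free, or occupied as range measurements come in. A toy sketch, restricted to axis-aligned beams for simplicity; real systems cast rays at arbitrary angles and use probabilistic (log-odds) updates instead of hard labels:

```python
UNKNOWN, FREE, OCCUPIED = -1, 0, 1

def make_grid(width, height):
    # Start with every cell unobserved.
    return [[UNKNOWN] * width for _ in range(height)]

def integrate_ray(grid, x, y, dx, dy, hit_range):
    # Core of an occupancy-grid update: cells the beam passed through are
    # free space; the cell at the measured range holds an obstacle.
    for step in range(hit_range):
        grid[y + dy * step][x + dx * step] = FREE
    grid[y + dy * hit_range][x + dx * hit_range] = OCCUPIED

grid = make_grid(10, 10)
# Robot at (5,5), beam pointing east, wall detected 3 cells away.
integrate_ray(grid, x=5, y=5, dx=1, dy=0, hit_range=3)
print(grid[5][8], grid[5][6])  # prints: 1 0  (wall cell, free cell)
```

Accumulate enough of these rays while the robot drives around and the free cells trace out the room shape the vacuum shows in its app.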
>>24556 Thanks. I expected this would be so. The next step is to make some sort of LLM, but instead it would be a Large Behavior Model (LBM). Maybe something like a hopped-up "ELIZA" model crossed with some LLM to have general knowledge. Then train it to interact in a fairly standard way. I mentioned here before that 1950s female home-economics videos and texts would be excellent. They have all sorts of videos on how to make a man comfortable, from when that mattered to women. If you could instill that sort of behavior, it would be GREAT.
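The ELIZA half of that idea is just pattern matching with reflection templates, which is cheap enough to run on anything. A minimal sketch; the rules here are invented examples, and in the hybrid being described, unmatched input would be handed off to the LLM:

```python
import re

# Illustrative rules: (pattern, response template). A deployed system would
# have many more of these and fall back to a language model on no match.
RULES = [
    (re.compile(r"\bi feel (.+)", re.I), "Why do you feel {0}?"),
    (re.compile(r"\bi want (.+)", re.I), "What would it mean to you to get {0}?"),
    (re.compile(r"\bhello\b", re.I), "Hello! How was your day?"),
]

def respond(utterance):
    for pattern, template in RULES:
        match = pattern.search(utterance)
        if match:
            return template.format(*match.groups())
    return None  # no rule matched -> hand off to the LLM

print(respond("I feel tired today"))  # prints: Why do you feel tired today?
```

The appeal of this hybrid is that the rule layer gives consistent, controllable behavior for the common cases while the LLM only handles the long tail.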
>>24531 If you or anyone else here actually begin assembling a Poppy Project headpat daughteru, then feel free to begin a project thread for it then. Cheers. :^)
Thoughts on the Cutieroid Project? https://www.cutieroid.com/Home Their model is dolled up (heh) to look like an anime character, with the face being a dynamic projection onto a screen; a similar design might be adapted with the new flexible displays. Mechanical function is all hidden under a rigid plastic shell, so I can't say what the drive system is like.
>>24637 I mentioned her in the humanoid robot videos thread: >>17815 Interesting project, but she's neither mobile nor a soft animated doll, and I think also not open source. So it's good to see the project exists, but that's it. I think she's meant for show, like an idol. It's similar to what missile_39 or 61laboratory does. Or many others... DS Doll, Ameca, that redhead from Japan whose name I forgot... this is maybe the most widespread approach, but not useful for intercourse, housework, cuddling, or doing anything mobile. It's basically just more advanced animatronics without pneumatics or hydraulics. It's fine for showing some things. I have a somewhat similar project which is currently a bit stalled ("Joystick waifu", "simple body"), but it has its limits, especially if it's also not open source.
>>24633 >assembling a Poppy Project headpat daughteru I think it's still $7k for the actuators, until they get cheaper. And walking is the wrong priority; people keep forgetting that.
>>24532 If it's $1,200, it's not really a problem. That's around the ballpark figure I was aiming at. At that point there's not much to discuss, and SophieDev already made the head too. While SophieDev's robot mouth moves up and down, that'd be okay for an anime girl with a tiny mouth. Could combine both. Is it $1,200 though?
>>24531 > archive-related : (>>24807)
So these devices have existed for a while now: two-wheeled, self-balancing robot kits. With a base like this you could mostly solve the mobility issue. https://www.instagram.com/p/CvZftxuLyeR/?igshid=MzRlODBiNWFlZA==
>>25269 I think that would be a great choice Anon. Please let us all know in our prototypes thread about your progress if you undertake this! Cheers. :^)
16+ Coolest 3D Printed Robotics Projects (2023 Update): https://www.3dsourced.com/feature-stories/3d-printed-robotics-robots/
>>26767 Thanks NoidoDev. That is a nice listing. I hope some of them who haven't yet, will eventually take the opensauce-Pill. >=== -minor edit
Edited last time by Chobitsu on 12/02/2023 (Sat) 23:29:55.
Open-source robomeido for the low, low cost of $31,757.86. Mobile ALOHA augments the ALOHA system with a mobile base and a whole-body teleoperation interface to collect training data for supervised behavior cloning. They released the hardware code, training code and training data under a MIT license, including a list of the hardware used and where they bought it. https://mobile-aloha.github.io/ >Imitation learning from human demonstrations has shown impressive performance in robotics. However, most results focus on table-top manipulation, lacking the mobility and dexterity necessary for generally useful tasks. In this work, we develop a system for imitating mobile manipulation tasks that are bimanual and require whole-body control. >We first present Mobile ALOHA, a low-cost and whole-body teleoperation system for data collection. It augments the ALOHA system with a mobile base, and a whole-body teleoperation interface. Using data collected with Mobile ALOHA, we then perform supervised behavior cloning and find that co-training with existing static ALOHA datasets boosts performance on mobile manipulation tasks. >With 50 demonstrations for each task, co-training can increase success rates by up to 90%, allowing Mobile ALOHA to autonomously complete complex mobile manipulation tasks such as sauteing and serving a piece of shrimp, opening a two-door wall cabinet to store heavy cooking pots, calling and entering an elevator, and lightly rinsing a used pan using a kitchen faucet. Previous ALOHA: https://tonyzhaozh.github.io/aloha/
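At its core, the supervised behavior cloning they describe is plain regression on (observation, action) pairs logged during teleoperation. A deliberately tiny sketch with a 1-D linear policy fitted by least squares, standing in for their neural network; the "expert" mapping here is invented for the example:

```python
# "Teleoperated" demonstrations: (state, action) pairs from an expert whose
# hidden policy is a = 2*s + 0.5. The learner only sees the pairs.
demos = [(s / 10.0, 2.0 * (s / 10.0) + 0.5) for s in range(-10, 11)]

# Ordinary least squares for a linear policy a = w*s + b.
n = len(demos)
mean_s = sum(s for s, _ in demos) / n
mean_a = sum(a for _, a in demos) / n
w = sum((s - mean_s) * (a - mean_a) for s, a in demos) / \
    sum((s - mean_s) ** 2 for s, _ in demos)
b = mean_a - w * mean_s

print(round(w, 3), round(b, 3))  # recovered policy coefficients
print(round(w * 0.3 + b, 3))     # cloned action for an unseen state
```

Mobile ALOHA replaces the linear fit with a deep network and the scalar state with camera images and whole-body joint positions, but the training objective is the same: imitate the demonstrated action for each observation.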
>>28128 That's great. I had some videos on my radar but didn't watch them. This might be good for guys who really want something doing chores at home, but their early waifu won't be able to do such things, so they can compartmentalize. It would also help reduce the necessary number of people in the workforce. I'm sure the prices will come down a lot within a decade. >teleoperation system for data collection Then it might be useful later to train other robots, e.g. robowaifus.
Open file (51.20 KB 750x953 pib_2022.jpg)
Open file (52.56 KB 1000x667 pib_Hand_2022.jpg)
Open file (334.87 KB 1024x507 pib-assembly-BOM.png)
Printable Intelligent Bot (PIB) is an open-source project free to download and experiment with, with parts available to purchase if you don't have a printer. It currently uses hobby servos and Raspberry Pi to operate: https://pib.rocks/ Older version .stl files (with the humanoid face) on thingiverse: https://www.thingiverse.com/thing:5276671 https://www.youtube.com/@pib_rocks I'm seriously considering a mash-up of this with a MaSiro chassis, upscaled BJD body parts and an anime kigurumi head.
>>29187 >I'm seriously considering a mash-up of this with a MaSiro chassis, upscaled BJD body parts and an anime kigurumi head. Okay, good luck. Why this one in particular? How do you want to combine these? Related: >>18281
The MaSiro robot has a distinct advantage over many other designs in that it can change its height, to include picking up objects from the floor or standing up to use normal counter-height appliances or tools. It also seems like an excellent starter project that could be made very inexpensively with some basic changes and a little scrounging, with an eye towards gradual upgrades with the goal of bipedal walking. For example:
- Replace all of the extruded aluminum with wood or plywood. It would all be replaced over time, so it is an unnecessary expense that amounts to over-building.
- Use hinges instead of bearings in the "leg" structure; they are temporary, so let's not waste money.
- Use an old hoverboard for the drive wheels, and possibly use the battery if it still has useful capacity. I have 5 hoverboards that cost a total of $7 USD. At least one should work. Only one charger though, so I may have to get a replacement from eBay.
- Use casters from office chairs instead of industrial units; again, they are temporary, and I see an office chair put out for trash every week.
- Use rollerblade and scooter bearings in the "hips" and body tilting/rotation structure; again, destined to be replaced in future upgrades, so go cheap.
After a lifetime of repairs, other projects' leftovers, taking up associated hobbies, and scrounging in anticipation of robot building, I can probably build the MaSiro chassis (modified) for $0.00, including the upper-body interior structure. Any anon should probably be able to do the same for <$100 USD with a little luck in the scrounging, and assuming no additional expenses for tools. The first phase of modification (after initial full completion) would be splitting the "leg" frame into two legs, independently driven to achieve the standing/kneeling positions. For the upper-body exterior, an upscaling of a BJD from Thingiverse, much like the work of hamcat_mqq >>7707, would serve as a starting place, and a kigurumi full-head mask could be down-sized for the head.
A kigurumi mask doesn't have the additional wall thickness of a BJD head, and so wouldn't need to be modified to make room for eye mechanics and electronics. The hands and arms of PIB, InMoov, and any other robots that might be suitable would be mixed for best results. By "suitable" I mean designed from the outside in, to be human-shaped and work, as opposed to designed from the inside out to replicate human movement with little or no regard for human shape. Examples of the latter include the LAD >>29019, Youbionic >>28680 and ASPIR >>20841. Don't get me started on servo wheels in the palms of hands. The MaSiro arms/hands are too weak, and the early version of the hands was more skeletal than anything else. The weakness can be partially overcome by increasing the overall size of the body. I'm 5'9" tall and I want my robowaifu in heels to eventually be close to my height, so her body proportions will be somewhat larger than the relatively small MaSiro, allowing larger, more powerful servos to be used. In addition I will be replicating the "staggered" musculature of the human body wherever possible (does anyone know of a term for this?). By that I mean the servos/actuators for the hands and wrists are in the forearms, the forearm/elbow mech is in the upper arms, the upper arm mech is in the shoulders, etc. All cable-driven, so the mass can be moved closer to the body to reduce moment loads in the arms, and to provide more bending torque than having the servo shaft at the joint pivot. The cost of electronics, actuators and other expenses will largely depend on the onboard CPU/OS I decide to use; a Raspberry Pi to start with, if I go that route. The result would be a prototype hard-shell robowaifu that could be enjoyed/trained/educated (aloha) while being slowly upgraded over time to a soft endoskeletal gynoid, or at least as much so as my skills allow.
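The payoff of that staggered, cable-driven layout can be ballparked with a simple static moment calculation. All masses and distances below are illustrative guesses for the sake of the comparison, not measurements from MaSiro or any particular servo:

```python
# Rough comparison of the static shoulder moment (N*m) for an actuator
# mass mounted out at the wrist versus relocated into the upper arm,
# cable-driving the wrist instead. Masses/distances are made-up examples.

G = 9.81  # gravitational acceleration, m/s^2

def shoulder_moment(mass_kg: float, distance_m: float) -> float:
    """Static moment about the shoulder for a point mass on a horizontal arm."""
    return mass_kg * G * distance_m

# A 0.3 kg servo at the wrist (0.6 m from the shoulder) vs the same
# servo moved into the upper arm (0.2 m from the shoulder).
at_wrist = shoulder_moment(0.3, 0.6)
in_upper_arm = shoulder_moment(0.3, 0.2)

print(round(at_wrist, 3))      # 1.766
print(round(in_upper_arm, 3))  # 0.589
```

Moving each actuator one segment inboard cuts the moment arm of its dead weight to a third in this example, which is exactly why the shoulder servos can be smaller than a naive joint-mounted design would need.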
>>29192 Hey, I didn't expect such a long answer. I should've advised you to lay out your concept in meta or in >>24152, and as soon as you start building something we have the prototype thread. We also have a thread on hands: >>4577 Someone here is working on a wheeled chassis, btw: >>24744 Idk if MaSiro's is freely available.
>>29187 Wow, this is really cool-looking, Anon. >>29192 Thanks for the excellent post, Robophiliac. Really looking forward to updates about your progress! Cheers. :^)
>>24531 >>24832 >>4436 >>9613 https://www.poppy-project.org/en/ The Poppy robot inspires a lot of interest, especially in those who see it "walking" on a treadmill with help from handlers. At least, until they find out how much it costs to build. This is mainly because of the Dynamixel servos it uses. Once you add up the different BOMs for each section, you find you will need:
https://www.robotis.us/dynamixel/
2 x AX18A @ $109.90 ea
19 x MX28T or AT @ $289.90 ea
5 x MX64T or AT @ $369.90 ea
You will also need the proprietary connector cables, controller boards, power supply(s) and associated hardware listed on the Poppy site. And that's usually enough information to kill any interest people may have had in this robot. Thankfully, however, it appears it's possible to use modified hobby servos in place of the Dynamixels, although this probably involves tweaking the operating software of the robot. Apparently the design files have been available for a while and have been modified to use standard hobby servos. They are available here, via the Hackaday link above:
https://github.com/GlenSearle/tentacle/blob/master/servo-s2309s.scad
How do you use them? No idea. But this guy does. He remixed Poppy for MG996R hobby servos, i.e. standard size:
https://www.thingiverse.com/thing:3992150
The sharp-eyed looking at Aster will notice that he also changed the design of the pelvis, so it won't be as maneuverable as the original Poppy.
Don't worry, you can still use original Poppy parts (or any other printable robot that uses Dynamixel servos) with this Dynamixel servo case:
https://www.thingiverse.com/thing:3487979
Or this one:
https://www.thingiverse.com/thing:476030
Standard Servo to Dynamixel AX-12A mod (smaller Dynamixels for smaller robots):
https://www.thingiverse.com/thing:3836662
Looking at the stall-torque numbers for Dynamixels, it appears the MX28T is roughly a 25 kg·cm servo and the MX64T roughly a 60 kg·cm servo, so those should probably be the minimum power ratings you would want. Of course, since they all come from China, be sure to look for reviews on YouTube for any servo you are considering buying. Metal gears only, preferably with an aluminum case for heat dissipation. Your hobby servos will not have one crucial feature which you will need: a feedback output. This is just a wire connected to the center tap of the servo's position-sensing potentiometer and run out of the servo case, so the voltage can be read by a controller with an analog input, converted to a digital value, and then used by the robot's software to confirm the servo's position. You will find many how-to videos on YouTube by searching "servo feedback mod", but I recommend this one, as it shows both the mod and how it is used in a robot running the software shown:
https://www.youtube.com/watch?v=lppPC2XuMsU
The software is available here, and the free version will do for this. Scroll down the page to see what types of controllers are supported.
https://synthiam.com/Products/ARC
Note that this is a "tethered" robot control system: the software runs on your PC and sends data back and forth with the robot, so reaction time may not be the same as with an onboard system. However, if your robot is large enough for a PC and your controller supports a USB connection, that shouldn't be a problem. One last note about the Poppy. Human ankles have a ball-and-socket type joint, but Poppy has a hinge. Why?
You will need to modify the ankles if you want software that is cross-platform for a waifubot type. Similarly, Poppy's hands are cosmetic; they don't work. You can probably use the InMoov hands scaled down for 8g to 9g servos. For this you will need a servo pulley to fit them, shown here:
https://www.thingiverse.com/thing:544321
You are on your own as far as how much of InMoov's arm to graft onto Poppy and how to do it. I've probably missed something, but if you want to try Poppy, this should get you started. Cheers.
Videos:
Poppy project:
https://www.youtube.com/c/PoppyprojectOrgVideos
https://github.com/poppy-project/poppy-humanoid
Intrinsically Motivated Goal Exploration, Automated Curriculum Learning and Emergence of Tool Use:
https://www.youtube.com/watch?v=NOLAwD4ZTW0
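For anyone who wants the damage up front, here's a quick tally of just the servo line items from the BOM above (the proprietary cables, controller boards and power supplies are extra):

```python
# Dynamixel servo quantities and unit prices from the Poppy BOMs,
# as listed on robotis.us at the time of the post above.
bom = {
    "AX18A":    (2,  109.90),
    "MX28T/AT": (19, 289.90),
    "MX64T/AT": (5,  369.90),
}

total = sum(qty * price for qty, price in bom.values())
print(f"servos alone: ${total:,.2f}")  # servos alone: $7,577.40
```

Over $7,500 before a single bracket is printed, which is why the hobby-servo remixes matter so much.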
>>29372 POTD Thanks very kindly for all the quality research, Anon! Cheers. :^)
>>29372 Thanks, this is very good news and could be enormously useful. I want to highlight this part: >feedback output. This is just a wire connected to the center tap of the servo's position sensing potentiometer and run out of the servo case. We need to keep an eye on this and how we can put this into every servo we build ourselves.
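To make the quoted idea concrete, here is a minimal sketch of the controller side of that feedback wire. It assumes a 10-bit ADC (e.g. an Arduino-class analog input) and a pot that sweeps linearly over the servo's 0-180° travel; real servos will need their endpoints calibrated, and the function names here are made up for illustration:

```python
# Sketch of reading back a servo's position through the feedback-mod
# wire: the pot's center-tap voltage lands on an analog input, and the
# controller maps the raw ADC count back to an estimated angle.
# Assumption: 10-bit ADC, linear pot spanning the full 0-180 degrees.

ADC_MIN, ADC_MAX = 0, 1023   # 10-bit converter range
ANGLE_MIN, ANGLE_MAX = 0.0, 180.0

def adc_to_angle(count: int) -> float:
    """Linearly map a raw ADC reading to an estimated servo angle."""
    count = max(ADC_MIN, min(ADC_MAX, count))  # clamp noisy readings
    span = (count - ADC_MIN) / (ADC_MAX - ADC_MIN)
    return ANGLE_MIN + span * (ANGLE_MAX - ANGLE_MIN)

def at_target(count: int, target_deg: float, tol_deg: float = 2.0) -> bool:
    """Confirm the servo reached its commanded position within tolerance."""
    return abs(adc_to_angle(count) - target_deg) <= tol_deg

print(round(adc_to_angle(512), 1))   # 90.1
print(at_target(512, 90.0))          # True
```

The same confirm-position loop is what the ARC-style tethered setup runs on the PC side; for a self-made servo, the only hardware cost is one wire and one spare analog pin.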
Open file (412.20 KB 826x333 Screenshot_257.png)
I wasn't sure where to put this: >We present Universal Manipulation Interface (UMI) -- a data collection and policy learning framework that allows direct skill transfer from in-the-wild human demonstrations to deployable robot policies. UMI employs hand-held grippers coupled with careful interface design to enable portable, low-cost, and information-rich data collection for challenging bimanual and dynamic manipulation demonstrations. To facilitate deployable policy learning, UMI incorporates a carefully designed policy interface with inference-time latency matching and a relative-trajectory action representation. The resulting learned policies are hardware-agnostic and deployable across multiple robot platforms. Equipped with these features, UMI framework unlocks new robot manipulation capabilities, allowing zero-shot generalizable dynamic, bimanual, precise, and long-horizon behaviors, by only changing the training data for each task. We demonstrate UMI’s versatility and efficacy with comprehensive real-world experiments, where policies learned via UMI zero-shot generalize to novel environments and objects when trained on diverse human demonstrations. https://umi-gripper.github.io/ https://github.com/real-stanford/universal_manipulation_interface
