/robowaifu/ - DIY Robot Wives

Advancing robotics to a point where anime catgrill meidos in tiny miniskirts are a reality.

Robotics sites, organizations and projects Robowaifu Technician 09/16/2019 (Mon) 04:21:24 No.268
There are a lot of robotics research and projects going on out there in the world, many of them with direct application for us here. Please contribute important, quality links you find that you think would help /robowaifu/.

Robot Operating System (ROS)
www.ros.org/
>[ROS] is a flexible framework for writing robot software. It is a collection of tools, libraries, and conventions that aim to simplify the task of creating complex and robust robot behavior across a wide variety of robotic platforms.
Open file (610.80 KB 744x654 ClipboardImage.png)
Components, servos, electronics, development and 3D printing - all cost about $1,200. Many people here have spent more than that on something that doesn't actually work. https://www.youtube.com/watch?v=KmPo8nHw2FQ This robot ACTUALLY works. You can hook it into llamav2 and custom-train the model to respond to you and have deep conversations. https://huggingface.co/spaces/ysharma/Explore_llamav2_with_TGI (try it out! You can run it locally!) Note, stuff like https://www.twitch.tv/alwaysbreaktime exists now. We're here, it's time to build the thing.
>>24531 Are you, yourself, planning on building this project Anon? If so, then I'll leave it up as a thread. Otherwise it will be merged with already-existing threads. Poppy Project is indeed a good project for R&D. We've known about it here within a month of its inception.
>>24536 What happened to my post?
>>24538 You can find it in the Chikun coop. I've warned you repeatedly about such posts here Anon. NYPA.
>>24539 you all have fun without me then
>>24535 Calm down, we are not getting left behind. I’m quickly typing this out on a phone so excuse any mistakes, but I hate to see someone get discouraged. I think OP is underestimating the amount of work left. I’ll comment mainly on the software side because that’s where I am focused. I feel there is a large disconnect between people’s expectations and the reality of what an LLM is. If you want a husk for a robowaifu then sure, stick llama2 in and call it a day. Large language models are not magic or a full mind. Your waifu will not have robust memory or opinions that are consistent. She will have no model of the world. There will be no internal emotional state, nor will she be aware of the passage of time or be able to passively observe things. LLMs as a mind are a red herring and will make your waifu a literal wordcel. Remember, all an LLM is doing is predicting the next token. It’s amazing how much that can achieve and I don’t want to take away from the amazing work deep learning has brought us. The way I envision a waifu mind being constructed is with a symbiosis of neural networks and classical AI / computing. I'll start a discussion on this board and share some observations and thoughts on this topic later (it’s off topic). I’m new here, this is actually my first post, I have found this place about a week ago and have been lurking. I have been working on and off on the artificial “waifu” problem independently for a long time as a hobby, although I referred to it as crafting a “chat bot”. I started back when AIML was the thing.
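To make the "symbiosis of neural networks and classical AI" idea concrete, here is one possible sketch: explicit memory, mood, and a clock live in ordinary code, and the LLM is only asked to do the wording. Everything here is hypothetical — `WaifuMind` and `query_llm` are illustrative names, and `query_llm` stands in for whatever local model you run.

```python
import time

class WaifuMind:
    """Hypothetical wrapper: classical state outside, LLM for wording only."""

    def __init__(self, query_llm):
        self.query_llm = query_llm        # callable: prompt str -> reply str
        self.memory = []                  # explicit episodic memory, not model weights
        self.mood = "neutral"             # symbolic emotional state
        self.last_seen = time.time()      # passage of time, tracked outside the LLM

    def respond(self, user_text):
        elapsed = time.time() - self.last_seen
        self.last_seen = time.time()
        # Inject the symbolic state into the prompt so replies stay consistent
        # with memory and mood, which the bare LLM cannot guarantee on its own.
        prompt = (f"[mood: {self.mood}] [hours since last chat: {elapsed / 3600:.1f}]\n"
                  f"[memories: {'; '.join(self.memory[-3:])}]\n"
                  f"User: {user_text}\nWaifu:")
        reply = self.query_llm(prompt)
        self.memory.append(f"user said: {user_text}")
        return reply
```

The point is that memory and time survive even if you swap the language model out entirely.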
>>24546 >I'll start a discussion on this board and share some observations and thoughts on this topic later (it’s off topic). Please do so Anon. You have a much better understanding of the general topic of a robowaifu mind than the bulk of the population does ATM. A language model is just one part of a vast set of interwoven tasks that all have to be solved together to have a working robowaifu. >I’m new here, this is actually my first post, I have found this place about a week ago and have been lurking. Welcome Anon! Glad you're here.
>>24546 You're not going to succeed at making chobitsu. You don't know what the goal is you just have this idea you can do a robot human. I know what my goal is, it's a sex bot and I will build the robot based on that.
>>24546 It's great you're here. What do you think the odds are of making a robowaifu that could learn basic commands like a dog right now? Come here, go over there, simple stuff. I realize a dog has a rich internal state and I don't think that's in the cards, but...basics? And this is assuming a commodity processor. Not the latest and greatest, but maybe a decent game machine type.
>>24550 Processing power is not an issue if you assume the waifu is going to get most of it from the internet via APIs and whatnot. Sure, there's stuff like the Jetson, which is more powerful than a Raspberry Pi for LLMs, but it's still not enough to make something that's good, and they can also get quite expensive. And now that I think about it, there's no reason why the waifu can't have a 4/5G connection anyways. Since my goal is a sex bot though, I know what to keep and what to take away. Her walking on uneven terrain doesn't matter to me, her being smart doesn't matter that much, her having visual recognition would be a bonus. It's far more important that she be able to give a hand job without hurting the handjob recipient. What a goal does to priorities.
>>24549 >You're not going to succeed at making chobitsu. Au contraire. None of us knows the future of course, but there are few enclaves of private individuals more likely to succeed at this grand adventure of devising real-world, opensource, quality robowaifus for the average man than /robowaifu/ is. Time alone will tell. :^) >You don't know what the goal is you just have this idea you can do a robot human. Ehh, it's a good point -- no real debate from me on this point in general. And the reason is simple: No one in the entirety of all human history has ever achieved what we're attempting here. Simple as. >I know what my goal is, it's a sex bot and I will build the robot based on that. It's a reasonable place to begin with for the shell of course (and a position we've held here on this board since our very first week of existence). But the majority of us want much, much more than just a sexbot from our robowaifus and some here don't even care about that aspect of it; we're all striving for wholesome, loving, and supportive companionship from our robowaifus. That's a very tall order I'm well-aware. But still, this is our endgame here, Pete. Cheers. :^) >=== -prose edit
Edited last time by Chobitsu on 08/10/2023 (Thu) 16:12:59.
>>24553 >if you assume the waifu is going to get most of it from the internet via APIs and whatnot. I don't think even a single honest anon here is eager to have the Globohomo surveillance state/rental property managers right there in the home with him, Anon. >tl;dr A robowaifu has to operate fully-autonomously, using nothing for her runtime needs but what anon has with him there inside his home. Never connect your robowaifu directly to the Internet! >=== -prose edit
Edited last time by Chobitsu on 08/10/2023 (Thu) 16:20:11.
>>24550 Technologies you'd be interested in can be found by looking into SLAM robot navigation; current robo vacuums have OK indoor navigation and positioning. They tend to use LiDAR and a camera pointing at the ceiling. They create a 2D room-shape map and ceiling landmarks to figure out their location. My robot vacuum has half a gig of RAM on its SoC. I have not investigated this in depth but hope that is useful information. For speech recognition of simple voice commands, that has been possible for a long time on low-power hardware. Look up PocketSphinx. What you're asking for is possible on low-end hardware; look at the pre-ERS-1000 Sony AIBO dogs :) >>24549 My goal is not human-level AGI, not even close. But you're correct, I am looking for more than a sex bot; I'm more interested in the AI side than the cooming aspect. Plus this is a hobby, I'm not looking for fast results and I enjoy the difficulty of the problem; working on this has helped me develop my skills.
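Once a recognizer like PocketSphinx hands you a text transcript, a dog-level command set is just a small fixed phrase table mapped onto actions. A minimal sketch, assuming the recognizer step is already done — the phrases and action names here are all made up for illustration:

```python
# Dog-level command layer: map a small, known phrase set onto motor actions.
# The transcript is assumed to come from any speech recognizer (e.g. PocketSphinx).
COMMANDS = {
    "come here": "MOVE_TO_SPEAKER",
    "go over there": "MOVE_TO_POINTED_SPOT",
    "sit": "STOP_AND_IDLE",
    "stay": "HOLD_POSITION",
}

def interpret(transcript):
    """Return the action for the first known phrase in the transcript, else None."""
    text = transcript.lower().strip()
    for phrase, action in COMMANDS.items():
        if phrase in text:
            return action
    return None  # unknown input: do nothing, like a confused dog
```

For example, `interpret("Come here!")` yields `"MOVE_TO_SPEAKER"`. Keeping the vocabulary tiny is exactly what lets this run on low-end hardware.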
>>24556 Thanks. I expected this would be so. The next step is to make some sort of LLM, but instead it would be a Large Behavior Model (LBM). Maybe something like a hopped-up "ELIZA" model crossed with some LLM to have general knowledge. Then train it to interact in a fairly standard way. I mentioned here before that 1950s female home economics videos and text would be excellent. They have all sorts of videos on how to make a Man comfortable, from when that mattered to Women. If you could instill that sort of behavior, it would be GREAT.
>>24531 If you or anyone else here actually begin assembling a Poppy Project headpat daughteru, then feel free to start a project thread for it. Cheers. :^)
Thoughts on the Cutieroid Project? https://www.cutieroid.com/Home Their model is dolled up (heh) to look like an anime character, with the face being a dynamic projection onto a screen; a similar design might be adapted with the new flexible displays. Mechanical function is all hidden under a rigid plastic shell, so I can't say what the drive system is like.
>>24637 I mentioned her in the humanoid robot videos thread: >>17815 Interesting project, but she's neither mobile nor a soft animated doll, and I think also not open source. So it's good to know the project exists, but that's it. I think she's meant for show, like an idol. It's similar to what missile_39 or 61laboratory does. Or many others... DS Doll, Ameca, that redhead from Japan whose name I forgot... this is maybe the most widespread approach, but not useful for intercourse, housework, cuddling, or doing anything mobile. It's basically just more advanced animatronics without pneumatics or hydraulics. It's fine for showing some things; I have a somewhat similar project which is currently a bit stalled ("Joystick waifu", "simple body"), but it has its limits, especially if it's also not open source.
>>24633 >assembling a Poppy Project headpat daughteru I think it's still 7k for the actuators, unless they've gotten cheaper. And walking is the wrong priority; people keep forgetting that.
>>24532 If it's $1,200 it's not really a problem. That's around the ballpark figure I was aiming at. At that point there's not much to discuss, and Sophie's dev already made the head too. While Sophie's dev's robot mouth only moves up and down, that'd be okay for an anime girl with a tiny mouth. Could combine both. Is it $1,200 though?
>>24531 > archive-related : (>>24807)
So these devices have existed for a while now: two-wheeled self-balancing robot kits. With a base like this you could mostly solve the mobility issue. https://www.instagram.com/p/CvZftxuLyeR/?igshid=MzRlODBiNWFlZA==
>>25269 I think that would be a great choice Anon. Please let us all know in our prototypes thread about your progress if you undertake this! Cheers. :^)
16+ Coolest 3D Printed Robotics Projects (2023 Update): https://www.3dsourced.com/feature-stories/3d-printed-robotics-robots/
>>26767 Thanks NoidoDev. That is a nice listing. I hope some of them who haven't yet, will eventually take the opensauce-Pill. >=== -minor edit
Edited last time by Chobitsu on 12/02/2023 (Sat) 23:29:55.
Open-source robomeido for the low, low cost of $31,757.86. Mobile ALOHA augments the ALOHA system with a mobile base and a whole-body teleoperation interface to collect training data for supervised behavior cloning. They released the hardware code, training code and training data under an MIT license, including a list of the hardware used and where they bought it. https://mobile-aloha.github.io/ >Imitation learning from human demonstrations has shown impressive performance in robotics. However, most results focus on table-top manipulation, lacking the mobility and dexterity necessary for generally useful tasks. In this work, we develop a system for imitating mobile manipulation tasks that are bimanual and require whole-body control. >We first present Mobile ALOHA, a low-cost and whole-body teleoperation system for data collection. It augments the ALOHA system with a mobile base, and a whole-body teleoperation interface. Using data collected with Mobile ALOHA, we then perform supervised behavior cloning and find that co-training with existing static ALOHA datasets boosts performance on mobile manipulation tasks. >With 50 demonstrations for each task, co-training can increase success rates by up to 90%, allowing Mobile ALOHA to autonomously complete complex mobile manipulation tasks such as sauteing and serving a piece of shrimp, opening a two-door wall cabinet to store heavy cooking pots, calling and entering an elevator, and lightly rinsing a used pan using a kitchen faucet. Previous ALOHA: https://tonyzhaozh.github.io/aloha/
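"Supervised behavior cloning" just means fitting a policy to (observation, action) pairs recorded during teleoperation. Mobile ALOHA uses deep networks for this; as a toy illustration of the same idea (not their actual stack), here is a 1-D linear policy fit to a handful of fake demonstrations:

```python
# Toy behavior cloning: learn action = f(observation) from demonstration pairs
# by plain stochastic gradient descent on squared error.
def clone_policy(demos, lr=0.01, epochs=500):
    """demos: list of (observation, action) floats. Returns fitted (w, b)."""
    w, b = 0.0, 0.0
    for _ in range(epochs):
        for obs, act in demos:
            pred = w * obs + b
            err = pred - act
            w -= lr * err * obs   # gradient of 0.5*err^2 w.r.t. w
            b -= lr * err         # gradient of 0.5*err^2 w.r.t. b
    return w, b

# Fake demonstrations of a trivial "skill" where action = 2 * observation + 1.
demos = [(0.0, 1.0), (1.0, 3.0), (2.0, 5.0), (3.0, 7.0)]
w, b = clone_policy(demos)
```

The real systems differ only in scale: high-dimensional camera observations, whole-body action vectors, and a neural network instead of `w * obs + b`, but the training signal is the same supervised regression onto recorded demonstrations.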
>>28128 That's great. I had some videos on my radar, but didn't watch them. This might be good for guys who really want something doing chores at home, but their early waifu won't be able to do such things, so they can compartmentalize. It'd also help reduce the necessary number of people in the workforce. I'm sure the prices will come down a lot within a decade. >teleoperation system for data collection Then it might be useful later to train other robots, e.g. robowaifus.
Open file (51.20 KB 750x953 pib_2022.jpg)
Open file (52.56 KB 1000x667 pib_Hand_2022.jpg)
Open file (334.87 KB 1024x507 pib-assembly-BOM.png)
Printable Intelligent Bot (PIB) is an open-source project free to download and experiment with, with parts available to purchase if you don't have a printer. It currently uses hobby servos and Raspberry Pi to operate: https://pib.rocks/ Older version .stl files (with the humanoid face) on thingiverse: https://www.thingiverse.com/thing:5276671 https://www.youtube.com/@pib_rocks I'm seriously considering a mash-up of this with a MaSiro chassis, upscaled BJD body parts and an anime kigurumi head.
>>29187 >I'm seriously considering a mash-up of this with a MaSiro chassis, upscaled BJD body parts and an anime kigurumi head. Okay, good luck. Why this one in particular? How do you want to combine these? Related: >>18281
The MaSiro robot has a distinct advantage over many other designs in that it can change its height, from picking up objects off the floor to standing up to use normal counter-height appliances or tools. It also seems like an excellent starter project that could be made very inexpensively with some basic changes and a little scrounging, with an eye towards gradual upgrades and the goal of bipedal walking. For example:
Replace all of the extruded aluminum with wood or plywood. It would all be replaced over time, so it is an unnecessary expense that amounts to over-building.
Use hinges instead of bearings in the "leg" structure; they are temporary, so let's not waste money.
Use an old hoverboard for the drive wheels, and possibly use the battery if it still has useful capacity. I have 5 hoverboards that cost a total of $7 USD. At least one should work. Only one charger though, so I may have to get a replacement from ebay.
Use casters from office chairs instead of industrial units; again, they are temporary, and I see an office chair put out for trash every week.
Use rollerblade and scooter bearings in the "hips" and body tilting/rotation structure; again, destined to be replaced in future upgrades, so go cheap.
After a lifetime of repairs, other projects' leftovers, taking up associated hobbies and scrounging in anticipation of robot building, I can probably build the MaSiro chassis (modified) for $0.00, including the upper body interior structure. Any anon should probably be able to do the same for <$100 USD with a little luck in the scrounging, and assuming no additional expenses for tools. The first phase of modification (after initial full completion) would be splitting the "leg" frame into two legs, independently driven, to achieve the standing/kneeling positions. For the upper body exterior, an upscaling of a BJD from thingiverse, much like the work of hamcat_mqq >>7707, would serve as a starting place, and a kigurumi full head mask could be down-sized for the head.
A kigurumi mask doesn't have the additional wall thickness of a BJD head, and so wouldn't need to be modified to get more room for eye mechanics and electronics. The hands and arms of PIB and InMoov, and any other robots that might be suitable, would be mixed for best results. By "suitable" I mean designed from the outside in to be human-shaped and work, as opposed to designed from the inside out to replicate human movement with little or no regard for human shape. Examples of the latter include the LAD >>29019, Youbionic >>28680 and ASPIR >>20841. Don't get me started on servo wheels in the palms of hands. The MaSiro arms/hands are too weak, and the early version of the hands was more skeletal than anything else. The weakness can be partially overcome by increasing the overall size of the body. I'm 5'9" tall and I want my robowaifu in heels to eventually be close to my height, so her body proportions will be somewhat larger than the relatively small MaSiro, allowing larger, more powerful servos to be used. In addition I will be replicating the "staggered" musculature of the human body wherever possible (does anyone know of a term for this?). By that I mean the servos/actuators for the hands and wrists are in the forearms, the forearm/elbow mech is in the upper arms, the upper arm mech is in the shoulders, etc. All cable-driven, so the mass can be moved closer to the body to reduce moment loads in the arms and provide more bending torque than having the servo shaft at the joint pivot. The cost of electronics, actuators and other expenses will largely depend on the onboard CPU/OS that I decide to use; a Raspberry Pi to start with, if I go that route. And that would be a prototype hard-shell robowaifu that could be enjoyed/trained/educated (ALOHA) while being slowly upgraded over time to a soft endoskeletal gynoid, or at least as much so as my skills would allow.
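The "staggered musculature" argument can be checked with back-of-the-envelope numbers: moving a servo's mass out of the forearm and toward the shoulder directly cuts the static torque the shoulder must hold. The masses and distances below are illustrative guesses, not measurements of any real arm:

```python
# Static moment load at the shoulder from point masses along an outstretched arm.
G = 9.81  # gravitational acceleration, m/s^2

def shoulder_torque(masses_at_distances):
    """Sum of m * G * d (N*m) for point masses m (kg) at distances d (m) from the shoulder."""
    return sum(m * G * d for m, d in masses_at_distances)

# A 60 g servo sitting at the wrist (0.5 m out) vs. the same servo relocated
# near the shoulder (0.05 m out) driving the wrist through a cable run:
servo_at_wrist = shoulder_torque([(0.06, 0.5)])
servo_near_shoulder = shoulder_torque([(0.06, 0.05)])
```

With these guessed numbers the relocated servo loads the shoulder ten times less, which is the whole point of cable-driving the distal joints.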
>>29192 Hey, I didn't expect such a long answer, I should've advised you to lay out your concept in meta or in >>24152 and as soon as you start building something we have the prototype thread. We also have a thread on hands: >>4577 Someone here is working on a wheeled chassis, btw: >>24744 idk if Masiro's is freely available.
>>29187 Wow, this is really cool-looking, Anon. >>29192 Thanks for the excellent post, Robophiliac. Really looking forward to updates about your progress! Cheers. :^)
>>24531 >>24832 >>4436 >>9613 https://www.poppy-project.org/en/ The Poppy robot inspires a lot of interest, especially in those who see it "walking" on a treadmill with help from handlers. At least, until they find out how much it will cost to build. This is mainly because of the Dynamixel servos it uses. Once you add up the different BOMs for each section, you find you will need: https://www.robotis.us/dynamixel/ 2 x AX18A @ $109.90 ea 19 x MX28T or AT @ $289.90 ea 5 x MX64T or AT @ $369.90 ea You will also need the proprietary connector cables, controller boards, power supply(s) and associated hardware listed on the Poppy site. And that's usually enough information to kill any interest people may have had in this robot. Thankfully, however, it appears it's possible to use modified hobby servos in place of the Dynamixels, although this probably involves tweaking the operating software of the robot. Apparently the design files have been available for a while and have been modified to use standard hobby servos. They are available here, via the Hackaday link above: https://github.com/GlenSearle/tentacle/blob/master/servo-s2309s.scad How do you use them? No idea. But this guy does. He remixed Poppy for MG996R hobby servos, i.e. standard size: https://www.thingiverse.com/thing:3992150 The sharp-eyed looking at Aster will realize that he also changed the design of the pelvis, so it won't be as maneuverable as the original Poppy.
Don't worry, you can still use original Poppy parts (or any other printable robot that uses Dynamixel servos) with this Dynamixel servo case: https://www.thingiverse.com/thing:3487979 Or this one: https://www.thingiverse.com/thing:476030 Standard Servo to Dynamixel AX-12A Mod (smaller Dynamixels for smaller robots): https://www.thingiverse.com/thing:3836662 Looking at the stall-torque numbers for Dynamixels, it appears the MX28T is approximately a 25kg/cm servo and the MX64T a 60kg/cm servo, so those should probably be the minimum power rating you would want. Of course, since they all come from China, be sure to look for reviews on YouTube for any servo you are considering buying. Metal gear only, preferably with an aluminum case for heat dissipation. Your hobby servos will not have one crucial feature which you will need: a feedback output. This is just a wire connected to the center tap of the servo's position sensing potentiometer and run out of the servo case, so the voltage can be read by a controller with an analog input, converted to a digital value, and then used by the robot's software to confirm the servo's position. You will find many how-to videos on YouTube by searching "servo feedback mod", but I recommend this one, as it shows the mod and how it is used in a robot using the software shown: https://www.youtube.com/watch?v=lppPC2XuMsU The software is available here, and the free version will do for this. Scroll down the page to see what types of controllers are supported. https://synthiam.com/Products/ARC Note that this is a "tethered" robot control system: the software runs on your PC and sends data back and forth with the robot, so reaction time may not be the same as with an onboard system. However, if your robot is large enough for a PC and your controller supports a USB connection, that shouldn't be a problem. One last note about the Poppy: human ankles have a ball-and-socket type joint, but Poppy has a hinge. Why?
You will need to modify the ankles if you want software that is cross-platform for a waifubot type. Similarly, Poppy's hands are cosmetic: they don't work. You can probably use the InMoov hands scaled down for 8g to 9g servos. For this you will need a servo pulley to fit them, shown here: https://www.thingiverse.com/thing:544321 You are on your own as far as how much of InMoov's arm to graft onto Poppy and how to do it. I've probably missed something, but if you want to try Poppy, this should get you started. Cheers. Videos: Poppy project: https://www.youtube.com/c/PoppyprojectOrgVideos https://github.com/poppy-project/poppy-humanoid Intrinsically Motivated Goal Exploration, Automated Curriculum Learning and Emergence of Tool Use: https://www.youtube.com/watch?v=NOLAwD4ZTW0
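Once the feedback wire described above reaches an analog input, turning the ADC reading back into a servo angle is a linear interpolation between two calibration endpoints. The counts below are example values; you'd measure your own servo at each end of its travel:

```python
# Convert the feedback-wire ADC reading into a servo angle.
# Calibration endpoints are hypothetical: counts observed at 0 deg and 180 deg.
ADC_MIN, ADC_MAX = 120, 900
ANGLE_MIN, ANGLE_MAX = 0.0, 180.0

def counts_to_degrees(counts):
    """Linearly interpolate an ADC reading into a servo angle, clamped to the travel range."""
    counts = max(ADC_MIN, min(ADC_MAX, counts))   # ignore noise outside calibration
    span = (counts - ADC_MIN) / (ADC_MAX - ADC_MIN)
    return ANGLE_MIN + span * (ANGLE_MAX - ANGLE_MIN)
```

For instance, a mid-travel reading of 510 counts maps to 90 degrees with these endpoints. The robot software compares this recovered angle against the commanded one to confirm the servo actually got there.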
>>29372 POTD Thanks very kindly for all the quality research, Anon! Cheers. :^)
>>29372 Thanks, this is very good news and could be enormously useful. I want to highlight this part: >feedback output. This is just a wire connected to the center tap of the servo's position sensing potentiometer and run out of the servo case. We need to keep an eye on this and how we can put this into every servo we build ourselves.
Open file (412.20 KB 826x333 Screenshot_257.png)
I wasn't sure where to put this: >We present Universal Manipulation Interface (UMI) -- a data collection and policy learning framework that allows direct skill transfer from in-the-wild human demonstrations to deployable robot policies. UMI employs hand-held grippers coupled with careful interface design to enable portable, low-cost, and information-rich data collection for challenging bimanual and dynamic manipulation demonstrations. To facilitate deployable policy learning, UMI incorporates a carefully designed policy interface with inference-time latency matching and a relative-trajectory action representation. The resulting learned policies are hardware-agnostic and deployable across multiple robot platforms. Equipped with these features, UMI framework unlocks new robot manipulation capabilities, allowing zero-shot generalizable dynamic, bimanual, precise, and long-horizon behaviors, by only changing the training data for each task. We demonstrate UMI’s versatility and efficacy with comprehensive real-world experiments, where policies learned via UMI zero-shot generalize to novel environments and objects when trained on diverse human demonstrations. https://umi-gripper.github.io/ https://github.com/real-stanford/universal_manipulation_interface
Pollen Robotics created an open-platform humanoid-on-wheels robot called Reachy. They even have a community page, though it doesn't look very active. They are apparently the same people who made Poppy, which was mentioned earlier. https://www.pollen-robotics.com/ Here is their github https://github.com/pollen-robotics/
>>30172 >Entire Reachy be €39,990 Please excuse my chuckle.
>>30172 >Reachy Looks like they've adopted a paradigm similar to our interim MaidCom approach. They appear to be expressly avoiding character appeal (in the 12 principles of animation sense). > pic-related I'd say this is a reasonably-good example of what I mean when I say that we can have fully opensource robowaifus, and anons can still turn that into for-profit business ventures. OTOH, I'm not in any way attempting to validate this specific example, merely using it as a data point. Thanks Anon! Cheers. :^)
>>30176 A Reachy Aroundy must cost extra. >>30177 >Looks like they've adopted a paradigm similar to our interim MaidCom approach. They appear to be expressly avoiding character appeal (in the 12 principles of animation sense). The wiggly antennae seem to be where all the characterization was meant to go, since they obviously serve no other purpose than looking a little cute wiggling around. >for-profit business ventures The problem is the costly Dynamixels they use being part of the steep price, but it has some things that could be directly copied and altered. I don't think the profit motive is the best motivation; it should be an afterthought behind collaborative effort, especially since this isn't some in-person robotics group going here. So it depends what you mean.
>>30188 >Problem is the costly Dynamixel they use being part of the steep price but it has some things that could be just directly copied and altered They are quality, but still likely overpriced at that. But ofc, as the entire design is open, other actuators can be used instead. >So depends what you mean. I'm a yuge fan of opensauce, and in my plans we're going to give away our first Model A robowaifu designs entirely free. Kits will cost money of course. Personally I want to keep the kits just barely above costs to promote the field. Entirely pre-built & tested robowaifus would cost a pretty good amount (slightly above the median industry cost of similar systems; since this is an entirely new field, this is still up in the air). Other anons are of course free to do whatever they choose to. Our own designs & software are MIT licensed, and newer works are likely to remain that way even after Model A is out in the world. I'm personally trying to foster an industry, not make a quick buck. --- >also: Lol, unless you'd like me to rm your first quote, I'm going to have to move your post into Sh*tepost Central (the Basement Lounge). Confer Rule #2 : (>>3)
>>30191 SPUD's code will be free once I get it finished and nicely documented, since it is mostly copypasta. I wouldn't feel right pay-walling copypasta (a horrible business practice, having morals and ethical standards). So you could use SPUD in your own Model A robowaifu, just remember where it came from :) T̶h̶e̶ ̶t̶h̶o̶u̶g̶h̶t̶ ̶o̶f̶ ̶t̶h̶e̶ ̶g̶l̶o̶r̶i̶o̶u̶s̶ ̶c̶h̶a̶o̶s̶ ̶t̶h̶a̶t̶ ̶c̶o̶u̶l̶d̶ ̶e̶n̶s̶u̶e̶ ̶w̶h̶e̶n̶ ̶o̶p̶e̶n̶-̶s̶o̶u̶r̶c̶e̶ ̶f̶r̶e̶e̶ ̶r̶o̶b̶o̶w̶a̶i̶f̶u̶ ̶c̶o̶d̶e̶ ̶i̶s̶ ̶a̶v̶a̶i̶l̶a̶b̶l̶e̶ ̶m̶a̶k̶e̶s̶ ̶m̶e̶ ̶c̶h̶u̶c̶k̶l̶e̶.̶
>>30195 Thanks kindly for the offer, Anon. I'll be pleased to see your final released form of it! >T̶h̶e̶ ̶t̶h̶o̶u̶g̶h̶t̶ ̶o̶f̶ ̶t̶h̶e̶ ̶g̶l̶o̶r̶i̶o̶u̶s̶ ̶c̶h̶a̶o̶s̶ ̶t̶h̶a̶t̶ ̶c̶o̶u̶l̶d̶ ̶e̶n̶s̶u̶e̶ ̶w̶h̶e̶n̶ ̶o̶p̶e̶n̶-̶s̶o̶u̶r̶c̶e̶ ̶f̶r̶e̶e̶ ̶r̶o̶b̶o̶w̶a̶i̶f̶u̶ ̶c̶o̶d̶e̶ ̶i̶s̶ ̶a̶v̶a̶i̶l̶a̶b̶l̶e̶ ̶m̶a̶k̶e̶s̶ ̶m̶e̶ ̶c̶h̶u̶c̶k̶l̶e̶.̶ This. Some anons here don't yet realize the enormity of the software engineering required to achieve great robowaifus. This is pretty much one of the main reasons I've gone to the trouble to learn C++ -- to prepare for just that need. OTOH, the satanic GH corpos we're all up against (whether that seems openly apparent yet or not) are all quite well-aware of the difficulty/expense involved. When we all release our code for free, and undercut all their evil machinations thereby... the salt will be delicious! :^) Ofc, don't be foolish and think they'll just take that "assault" on their sheqel purses (or on their society-destroying plots) lying down! Prepare today to weather their actual assaults against any of us they can pin down successfully. >=== -prose edit
Edited last time by Chobitsu on 03/08/2024 (Fri) 11:28:16.
Funny, I stumbled on this by accident while looking up hugging robots. Wasn't sure if I should post this here or the news thread. >Hugging Face is launching a new robotics project under former Tesla staff scientist Remi Cadene > He also said he was “looking for engineers” in Paris, France and posted a link to a job listing for an “Embodied Robotics Engineer,” which gives more clues, reading in part: >“At Hugging Face, we believe ML doesn’t have to be constrained to computers and servers, and that’s why we’re expanding our team with a new opportunity for a Robotics Engineer focusing on Machine Learning/AI. In this role, you will be responsible for designing, building, and maintaining open-source and low cost robotic systems that integrate AI technologies, specifically in deep learning and embodied AI. You will collaborate closely with ML engineers, researchers, and product teams to develop innovative solutions that push the boundaries of what’s possible in robotics and AI.“ >The listing also calls upon hires to “Design, build, and maintain open-source and low cost robotic systems integrating deep learning and embodied AI technologies” and “Build low cost robots with off the shelf electronic components and controllers and 3D printed parts.” > We’ve reached out to confirm the news with Hugging Face and ask for further information on the project. We will update when we hear back. https://venturebeat.com/ai/hugging-face-is-launching-an-open-source-robotics-project-led-by-former-tesla-scientist/ And the job listing is here https://apply.workable.com/huggingface/j/F612A84F16/
>>30205 Neat! While they are clearly GH-encumbered, HF is literally humanity's best hope for opensauce AI ATP. I wonder what they'll think when they realize the actual tsunami of consumer demand out there is for robowaifus, not just general 'embodied' agents? :DD Thanks Anon! Cheers. :^) >*fires up resume* >=== -minor edit
Edited last time by Chobitsu on 03/08/2024 (Fri) 14:23:48.
>>30203 It's rather surprising oobabooga and SillyTavern are still active, really. But I suppose it is security through obscurity. That's why *technically* I'm making SPUD as an artificial booth babe. If someone wants to use my product for lewd purposes... that is none of my business :)
>>30210 Yes, I think you're right. It's a strange dichotomy. There's some dynamic of the old turn of speech 'strange bedfellows' at play here IMO. < On the one hand: We here (and other Anons like us), with our clearly muhsoggyknees, literally Hitler desire for loving, helpful, and charming waifus. < On the other hand: Enthusiast groups largely funded by the evil globohomo, who want nothing more than to quadruple-down on an ultra-pozz overdose, gleefully pushing for an end to Whitey for good, and the destruction of the Patriarchy!111!!ONE!! ...both working together for the common goal of AI 'benevolence' towards all mankind! :DD Strange Days indeed, heh. :^) >=== -minor edit
Edited last time by Chobitsu on 03/09/2024 (Sat) 03:55:27.
I came across this soft-robotics-focused site with instructions for different actuators, mainly pneumatic. Their YouTube channel barely has any followers, so I may be the first here to point this out. https://opensoftmachines.com/
>ROS >Gazebo >Easily install the Robot Operating System (ROS) on any Linux distribution >Want to use ROS, but don't want to run Ubuntu? >This project uses the power of Nix to make it possible to develop and run ROS packages in the same way on any Linux machine. https://github.com/lopsided98/nix-ros-overlay/tree/develop
>>30887 Very nice. Thanks NoidoDev. I briefly considered this very topic a few years ago and basically figured it was a non-starter b/c CBA to try and manage moving it off of Ub*ntu. Maybe it's time to reconsider the prospect, though I anticipate there will still linger a lot of dependency hell, since ROS is a sprawling mess with a yuge attack surface.
>>30268 Thanks Anon! Very interesting approaches.
