/robowaifu/ - DIY Robot Wives

Advancing robotics to a point where anime catgrill meidos in tiny miniskirts are a reality.



SPUD (Specially Programmed UwU Droid) Mechnomancer 11/10/2023 (Fri) 23:18:11 No.26306
Henlo anons, Stumbled here via a youtube rabbit hole & thought I'd share my little side project. Started out as just an elaborate way to do some mech R&D (making a system to generate animation files in Blender on Windows and export/transfer them to a Raspberry Pi system) and found tinkering with the various python libraries a kinda neat way to pass the time when weather doesn't permit my outside mech work. Plus I'd end up with a booth babe that I don't have to pay or worry about running off with a convention attendee. Currently running voice commands via google speech and chatgpt integration, but I'm looking into offline/local stuff like OpenChat. WEF and such are so desperate to make a totalitarian cyberpunk dystopia I might as well make the fun bits to go along with it. And yes. Chicks do dig giant robots.
Okay, that's quite far evolved but doesn't look like something for snuggling. Where do you plan to go with this? Mobility? Silicone Face? Manipulation of Objects? Also, did you have an Atari Jaguar (I did)?
Hello Mechnomancer, welcome! This is a very interesting looking project. >Plus I'd end up with a booth babe that I don't have to pay or worry about running off with a convention attendee. Yes, as Masiro Project and others demonstrate, anything remotely robowaifu-related is pretty popular at conventions. >WEF and such are so desperate to make a totalitarian cyberpunk dystopia I might as well make the fun bits to go along with it. >And yes. Chicks do dig giant robots. LOL. Indeed, the Globohomo (in all its rapacious, octopus-like branches & divisions) is well along in its evil plots to destroy the civilizations of the Earth. Hopefully we here and others like us will present some serious obstacles in their path by helping with the big task of deconstructing feminism in our own important, practical way. And personally, I'm less concerned about the chicks digging robohusbandos, and more concerned about them as a species returning to the sanity of their God-ordained roles within a man's family. Robowaifus can actually help with that process. Looking forward very much to seeing what progress you make with your project, Anon! Cheers. :^) >=== -fmt, prose edit
Edited last time by Chobitsu on 11/11/2023 (Sat) 17:33:59.
Open file (78.41 KB 667x1000 1.jpg)
>>26307 Well, nobody is particularly cuddly without their skin on. I'm working on establishing a nice solid frame first. I'll be modifying the face a bit to cover it with some stretchy fabric. If I can't get it walking like a WowWee Femisapien (I got one off ebay and it is cute to see wandering around on its own https://www.youtube.com/watch?v=rNezxoSEEB0 ) I'll be giving it a mobile base like the attached mecha musume image (minus the mech arms), and some fabric/foam covering. End goal is something like a VTuber but going the other way... might be a funny co-host for a gaming channel... robowaifu onlyfans... idk. Ask it to hold small items, clock, alarm, weather, conversation, etc. Will probably release source code and maybe sell the .stl files/assembly instructions for a few bucks (like the cost of a diy-project book, so I can recoup development costs). If y'all wanna mod them to do off-brand things it's no skin off my nose. >>26317 >And personally, I'm less concerned about the chicks digging robohusbandos I was making a Megas XLR reference :) Even in her unfinished state, lots of folks liked SPUD; I go to a few local cons, and a local fair allows me to demonstrate my mechs/robots there in perpetuity for free, since there were folks coming to the fair specifically to see my stuff. Of course SPUD was wearing her hoodie so she looked a bit less like a necron gf XD Hope to get her mobile by con season next year, but I also have the mech to get walking on all six of her stubby legs. There's already articles about middle-aged childless women regretting their life choices. And unless they suddenly start FDA approval for growing babies in ziploc baggies like that sheep, feels like the great big ol' cargoship of society is startin' to turn (I hope for the best and plan for the worst, hence my 600lb dreadnaught-type mech). Competition breeds (pun intended) innovation, so guys leaving the dating pool for a machine (no offense meant) would prolly catalyze the turnabout. Building Spud, I better understand and appreciate all the little things the human body does that we take for granted, like compensating for the shifts in center of gravity from something as simple as raising an arm. So many little things one learns when building stuff. >=== -patch hotlink
Edited last time by Chobitsu on 11/13/2023 (Mon) 03:16:44.
>>26341 >since there were folks coming to the fair specifically to see my stuff. Excellent. That's a great sign Anon. I don't think most anons realize what a yuge sea-change is coming related to robowaifus. The demand is going to be outrageous. Every anon honestly struggling and striving right now to effectively solve every.little.thing. will eventually have an outstanding opportunity to be a first-mover (& shaker) in this new industry I predict will literally become larger than the global automotive transportation industries before all is said and done. Just let that idea sink in for a minute! :DD >Building Spud, I better understand and appreciate all the little things the human body does that we take for granted, like compensating for the shifts in center of gravity from something as simple as raising an arm. So many little things one learns when building stuff. God's designs of higher life are nothing short of truly mind-boggling, if you simply open your eyes and observe. So, in the film industry, there's a sub-branch of EFX called animatronics (typically about the same as the concomitant field in the Dark Rides industry). A fairly commonplace tool for that is a so-called Monkey, which is simply a manually-adjusted motion control device fashioned into a crude approximation of the body form being controlled (much like a maquette, in the case of a humanoid). Not much by way of preprogramming numbers, code, &tc., is needed for the technician-artists driving the monkeys. They simply pose-to-pose act out the performance that they want, and the full-sized rig (whether practical or virtual) follows suit. Kind of like an indirect MoCap scenario, so to speak. All it takes is patience and good instincts; writing code &tc., isn't needed at all at this stage of the performance pipeline. Just be a natural good actor with a great sense of balance & pose. I've been working towards devising an inexpensive, bipedal-capable form for /robowaifu/, et al, and I've already started gravitating towards a monkey-like control system to program in the base movements such as walking, running, jumping, climbing, backflips, &tc. Once the animation data's recorded, then of course it can be further refined & tweaked afterwards. Eventually, we'll have a collection of software algorithms for 'bio'-kinematics control models, sufficiently accurate for retargeting; with which we can draw from ginormous amounts of professionally-mocap'd performance data to run upon our robowaifu's shells thereafter. >tl;dr It's gonna be ebin in the end!! Cheers. :^) >=== -prose edit
Edited last time by Chobitsu on 11/12/2023 (Sun) 00:27:36.
Open file (7.07 MB 246x315 powerarmorgif2.gif)
>>26345 I mean, not only do I have the WIP robowaifu but I also have a 600lb, 8ft pilotable mech and powerarmor (attached) XD What remains of Megabots was at one point stalking me, quite butthurt and jelly at my work
>>26351 Man I have to say that you have an amazing project going on. The robowaifu ain't bad either. You talk about selling kits; you know how many 40k enthusiasts would buy a walking space marine suit? That's cool as fuck, great job, I'd totally have sex with that mech.
Open file (32.54 KB 600x900 Gourai.png)
>>26341 >Gourai Based Please provide details on SPUD. Height, weight, speed, power consumption, capabilities, processor, motor controller, sensors, actuators, what printer are you using, etc... Information in a succinct post to make it easy for people to understand her quickly.
>>26352 Same energy as the attached lol. >>26355 Spud specifications as of the last public demo (paper ears/atari hat, wearing hoodie): currently 5ft, about 30lbs, stationary. Standard-size servo package (similar to MG996R) with varying weight limits (9/25/35/75kg), controlled with a PCA9685 servo board connected via I2C to a raspberry pi 4 3gb. Power switch to swap between AC and the onboard 40v 2.5Ah lipo battery. Can create custom animations (arm movement) via "blend" file. Smol LCD to display eye icons and access the Raspbian OS, all scripted in Python.
Voice commands include:
Activate/deactivate chatgpt integration
Weather (need to get a new python library as the old one has recently been deprecated)
Time
Date
Set alarm @ time with a message
Adjust listen time between 3 and 60 seconds
Debug show eye expressions
TTS for chatgpt (and chatgpt can control eye expressions)
Activate wave animation
Current WIP (necron-looking):
Separate lcd screens via SPI to free up the HDMI port so the OS can be accessed while SPUD is running
Physical blinking eyelids
Solid wiggly ears :3
More robust neck mechanism w/ ambient movement & face detection (webcam) where the eyes follow whatever is detected
Better cable management
Future plans:
Integration of a local-ish LLM (separate tower PC, communication via wifi)
Localized speech to text (anyone wanna make the pocketsphinx library not sound like a hard boston accent?)
Replacement of the linear actuators with ASMC-04B
Walking legs or seated on a mobile base (or maybe both)
Dedicated GUI
More mouth movement via SG-90s
Cloth face cover (eyelids too?)
Robot-like cloth/foam covering/morphsuit
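For reference, driving a servo on that kind of PCA9685 hat over I2C from a Pi only takes a few lines. This is a minimal sketch, not SPUD's actual code; it assumes the Adafruit CircuitPython ServoKit library is installed, and the channel number and angles are placeholders.

# minimal sketch: wave one servo on a PCA9685 hat over I2C
# assumes: pip install adafruit-circuitpython-servokit (channel 0 is a placeholder)
import time
from adafruit_servokit import ServoKit

kit = ServoKit(channels=16)                     # PCA9685 defaults to I2C address 0x40
kit.servo[0].set_pulse_width_range(500, 2500)   # match your servo's datasheet

for angle in (0, 90, 180, 90, 0):               # simple wave-animation keyframes
    kit.servo[0].angle = angle
    time.sleep(0.5)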
>>26306 >>26351 Unfathomably based and incredibly awesome. (but God, I wish I had a mech suit too lol) If you have time, could you elaborate about your experiences/goals at fairs and conventions? I really don’t know how to build IRL interest for my own project ( >>24744 ), nor what benefits it could bring, but it could be worthwhile to start planning ahead for them. (then again, I’ve never been to a con’, and the fairs I’ve attended all had livestock stalls and a rodeo arena— needless to say, I’m pretty clueless) >>26351 Off topic, but are there any takeaways re: armor you’d be willing to share (if you considered that with your mech)? I’m hoping to armor my design’s battery carriage enough that it can survive a few shots from a handgun, if only to avoid my multi-year project going up in flames cause some inner-city thug was having a bad day…
>>26351 Looks pretty wild Anon. Stay safe in there! :D >>26356 Nice specs on Spud, Mechnomancer. I hope you manage all your WIP & future plans. Looking forward to watching your progress ITT! Cheers. :^)
Open file (1.04 MB 878x662 chonky servoos.png)
Open file (1.17 MB 675x897 mek doggo house.png)
>>26364 About IRL stuff:
1) It always takes longer than you think or plan (setting up a booth, transportation, or building stuff)
2) Most folks understand when something is a work-in-progress or prototype. The hard critic/butthurt naysayer is rare in meatspace
3) Unless you explicitly state you're a one-man band doing it out-of-pocket, folks will think you got university funding or somethin.
My fair in bumfuk is mostly agricultural, but some of the vendors that follow the midway company with the fair sell anime merch. Used to be an Oscar the Robot wandering around when I was a kid https://www.oscartherobot.com/ but there doesn't seem to be one anymore, so I might as well be the change I wanna see and make an animu version. Biggest pain with my projects is designing bits to work exactly how I want - I fly by the seat of my pants, making physical parts quick as possible so they can be tested IRL. But once past that phase, building what you know is rather easy (mech parts, powerarmor, etc). Armor is the easy bit so I'm focusing on making it work, then protection. >>26368 >Stay safe in there! :D I already have a slight powerarmor injury from it closing around my ribs a little too tight; 2 years out, just occasional slight twinge/soreness - rib injuries are a bitch to heal. Mech goal is to try to start grassroots giant robot fights (jumbo piloted rockem sockem)/giant robot first competitions. Like Megabots/Kuratas except, yknow, real and unscripted. I have a venue for the latter, I just need to rustle up competitors. Of course, as the weather turns I'll be focusing less on the mech and more on SPUD. Mech's list is to install new motor pods, shorten leg lift linkages by 2.5" (that's cutting 16 pieces of 1" squarestock and drilling 16 new holes!), install leg control computers and program the walk sequence. Doubt I'll get that done by the time heavy snow sets in (see attached "mek doggo house.png"). Currently waiting on a shipment of flanges to mount the ASMC-04B 180kg servos to the 3d printed bits (see attached "chonky servoos.png") so I can get a start on the legs. Checked out David Browne, gave me some ideas how to upgrade SPUD's face even more (been thinking about switchin over to cloth eyelids anyway).
>>26380 >rib injuries are a bitch to heal. They sure are. Sorry to hear it bro, hopefully it'll get better! >Of course, as the weather turns I'll be focusing less on the mech and more on SPUD. Great! SPUD is much more pertinent and of interest to /robowaifu/'s goals. >the ASMC-04B 180kg servos Their form-factor reminds me of dear Elfdroid Sophie's shoulder servos. (>>10152, >>7284)
Open file (268.80 KB 816x612 servos smol.jpg)
Open file (774.23 KB 624x816 spud scale.png)
>>26382 I'll keep the mechposting to a minimum :) Originally started robowaifu development with "Carry", the emotive tool holder/workshop waifu... get it? She can carry tools lol. Functions were pretty basic like on-board speech recognition for simple things like raising arms, bowing and looking cute. Attached pic is her among her other mek siblings as of spring 2022. She's currently in a state of disassembly in the mechworkshop. Those elfdroid sophie shoulder servos do indeed look familiar! Was kinda tempted to put them in Spud's shoulders but I'll stick with my small ones for safety concerns. I want to make sure she is bottom-heavy so when she does (hopefully) walk she'll be less likely to tip over. Attached is Spud next to a RSMedia (for scale and also inspiration for the devkit I'll eventually release) and a closeup of the double-jointed elbow (inspired by some art)
Open file (183.04 KB 1280x790 real_steel.jpg)
>>26384 >I'll keep the mechposting to a minimum :) No worries, it's amazing stuff. Good luck with starting a league competition system. Soon life mimics art. > We're simply more about 'make love, not war' here is all. :D >the double-jointed elbow Nice design. I actually immediately thought it was a knee from the thumbnails hehe. I'd like to use a similar design, but with rubber bumpers + some kind of elastomeric suspension at the joints. Keep moving forward this Winter, Mechnomancer. :^)
Open file (1.41 MB 735x955 servoknees.png)
>>26386 >We're simply more about 'make love, not war' here is all. :D To each their own, but I'd rather be a warrior in a garden than a gardener in a war (e.g. hope for the best, plan for the worst). Protecc the waifu with your laifu XD Joint was initially designed as a knee (see attached), but experimenting proved the servo package isn't really powerful enuff to move legs of such size (at least without bulky gears). Working on replacing the 2 eyelid MG995 servos with 4 SG90s (I also need to steal a few MG995 circuit boards for the mech leg motor controllers). This way a cloth eyelid can be stretched over the lcd screen, slide along the surface and have minimal gap between eye & face surface. Plus eyelids could be set to angles to give extra angry/sad expression.
>>26387 >Protecc the waifu It's a fair point : (>>10000). But as I mentioned before in another thread, I don't want /robowaifu/ to devolve into 'Terminatorz-R-Us' . I'm quite well-aware of the military connotations here, but this isn't really the proper forum for such developments. >This way a cloth eyelid can be stretched over the lcd screen, slide along the surface and have minimal gap between eye & face surface. >Plus eyelids could be set to angles to give extra angry/sad expression. Excellent. The eyes are 'the window to the soul', or so the old adage goes. You're already well along the way to highly-flexible eyelids simply by dint of choosing cloth materials. I'd suggest you confer vids-related : (>>1392) Anon; a very clean & simple concept for further warping of the cloth lids during emotive expressions, &tc. >=== -prose edit
Edited last time by Chobitsu on 11/13/2023 (Mon) 18:15:28.
Open file (244.00 KB 337x587 hearts.png)
>>26388 > I'd suggest you confer vids-related (>>1392) for a very clean concept for further warping the cloth lids during emotive expressions, Anon. Heh I've seen ST-01 and kinda stole the eyebrows. The eyelids in the vid seem to have a bit of a spasm. If unintentional, that's usually caused by the servos either being under strain (approaching stall) or the signal line picking up electrical noise from somewhere (typically a power cable). Not a big deal with little robots, but I'm a little traumatized from that happening with my larger projects, enough to avoid it like the plague. I'll only mention my larger robot projects if relevant to/affecting development of SPUD, such as the above. Spud is built to be fren :) Something else I'll be looking into is having 2 different eye icons at once, so spud could be cross-eyed or have asymmetrical eye icons - the eye screens are currently wired in parallel. Could be as simple as modifying the library to change the Chip Select pin; have to experiment a bit.
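For reference on the chip-select idea: the Pi's SPI0 bus exposes two hardware chip-select lines (CE0/CE1), so one hedged way to address two eye screens separately is simply to open the same bus under two device numbers. This is a sketch only; the actual init and pixel commands depend entirely on the LCD's driver chip and on whatever display library is in use, and the 0x2C byte below is just a placeholder.

# sketch only: two displays on SPI0, selected by CE0 vs CE1
# real init/pixel-push commands depend on the LCD controller (e.g. ST7735/ILI9341)
import spidev

left_eye = spidev.SpiDev()
left_eye.open(0, 0)        # bus 0, chip-select CE0
right_eye = spidev.SpiDev()
right_eye.open(0, 1)       # bus 0, chip-select CE1
for dev in (left_eye, right_eye):
    dev.max_speed_hz = 16_000_000

left_eye.xfer2([0x2C])     # placeholder command byte; panel-specific in practice
right_eye.xfer2([0x2C])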
>>26390 how much was the material cost for this roughly?
>>26390 >but I'm a little traumatized from that happening with my larger projects, enough to avoid it like the plague. Twisted, shielded pair is the standard signalling go-to in EM-noisy environments. If that's not enough, go with electro-optical transducers + fiber. >Spud is built to be fren :) Even everyday robowaifus will face issues with this. And of course the more complex/evolved they become, the more likely. There is also unwanted, exterior noise to block. I may say that all our 'breadbox' (a cooled, electronics housing system within the upper thorax) designs are Faraday cages as well. >Could be as simple as modifying the library to change the Chip Select pin; have to experiment a bit. My animator's mindset compels me to think 'the sky's the limit' when discussing screen faces/eyes. :^) >=== -add 'Faraday cages' cmnt -minor edit
Edited last time by Chobitsu on 11/13/2023 (Mon) 19:09:45.
>>26391 Roughly $500. Most expensive bits are the 5x 70kg servos and the pi. I do have a tendency to overengineer/use a large factor of safety (holdovers from experience with my larger projects) so you might be able to get away with cheaper servos. >>26393 >There is also unwanted, exterior noise to block. So far Spud has experienced no motor noise, although I've had to separate the loops into separate threads to get smooth motions (thread for face detection, thread for determining movement, thread for implementing movement). I've heard some autistic detail about the python "threading" library, but don't really care so long as it makes it run faster. I'm like a 40k ork in that way. I'll try to get some footage of the face/eye tracking stuff later today, it's fun to see the 'bot's eyes following a face around the screen :D
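As a rough illustration of that three-thread split (not the actual SPUD code), each loop can run in its own daemon thread and share the latest face position through a lock; the detect/plan/move bodies below are placeholders.

# rough sketch: face detection / movement planning / servo output in separate threads
import threading, time

state = {"face": None, "target": None}
lock = threading.Lock()

def face_loop():
    while True:
        face = None                       # placeholder: grab a frame, run the detector
        with lock:
            state["face"] = face
        time.sleep(0.1)

def plan_loop():
    while True:
        with lock:
            state["target"] = state["face"]   # placeholder movement planning
        time.sleep(0.05)

def move_loop():
    while True:
        with lock:
            target = state["target"]
        # placeholder: step servo positions toward target here
        time.sleep(0.02)

for fn in (face_loop, plan_loop, move_loop):
    threading.Thread(target=fn, daemon=True).start()
while True:
    time.sleep(1)                             # keep the main thread alive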
>>26395 >I'm like a 40k ork in that way. Heh, you have no idea. This type of stuff will prove to be deepest rabbit-hole of them all, before we all push across the finish line together in the end. >t. systems-engineering wannabe. :^) >I'll try to get some footage of the face/eye tracking stuff later today Looking forward to that Anon. Cheers.
Open file (3.52 MB 1801x903 spiny.png)
I forgot that I disassembled SPUD's stand while testing the ASMC-04Bs -give these babies enough power and they can ZOOOM-, so I did a little more prep for the upcoming shipment of servo flanges by doing a lower body/spine transplant. Good-bye fun-noodle spine and heavy mech-grade actuators! The process reminded me of the intro to Battle Angel Alita, schlepping the half-assembled torso around. I'll save you the messy details: results are attached. The rib cage connects in the front, and the tension of the ribcage holds the bolts in place, eliminating the need for nuts (other machine screws you may see self-tap). Plenty of space in the abdomen to stuff a tool battery & charger/ac adapter. I'd love to figure out a system to seamlessly switch between the two, but rebooting SPUD to switch between power modes wouldn't be too bad I guess. I took into account the structure of the motors so they make the hips/upper thighs flare out from the pelvis, however I think the hips are a bit too wide: over double the width of the ribcage! Already got some ideas to reduce the width, like rotating the pelvic ASMC-04Bs vertical, recessing the flange on the pelvic servo linkage, and increasing the length of the keyway notch on the hip motors all the way down the shaft. These things are like $50 each, so I'll practice on the one I accidentally broke rather than risk breaking more of them (legs are gonna be a total of 9 of these beasties; over the past 6 months I got 7). I just need to go over the wiring with a fine-tooth comb to ensure nothing went hinky before turning SPUD on again. My mech once caught on fire (don't worry, it was small enough to blow out) and that is an experience I want to avoid repeating.
Open file (48.66 KB 321x826 StandardFranxx.png)
>>26345 >Monkey Surprised this hasn't come up before, please make a monkey/mocap thread! >>26356 Thanks for the specs! >>26384 >Genista Nice! Personally want to clang a standard Franxx. >>26393 >Twisted pair Beat me to it. I'll add that you can use Cat5 cable for data lines. Wrapping wires in copper/aluminum tape is a cheap alternative.
>>26401 >>>Monkey >Surprised this hasn't come up before, please make a monkey/mocap thread! It did come up, or at least something similar, but just briefly. James Bruton made a video about it, or even two. I can't find the posting here on the board, since I can't compile Waifusearch. Anyways, I even bought the electronics for doing that. Problem is, no one here has an actual robot to control. >DMX controllers (I always forget the name) https://www.youtu.be/6TAfDX1u7w0 https://www.youtu.be/diVXbuRislM
>>26401 >please make a monkey/mocap thread! A Robowaifu Motion Capture/Motion Control thread would indeed be a good idea Kiwi. I'll think about what might make a good thread related to our specific needs. >Cat5 cable Yep, it's not only a widely-used standard interface, it also has 4 separate, balanced data channels per cable. Good thinking. >>26404 >since I can't compile Waifusearch. Huh? I thought you were using it regularly Anon. Is there something I can do to help? >DMX Yes it's a widespread C&C protocol, especially for stage lighting, etc. It's rather heavy for our internal, onboard robowaifu uses IMO, but it's a reasonable consideration (at least for Monkey captures, etc.) >le ebin blackpill Lol. I hope you get encouraged soon NoidoDev. We all count on you here! Cheers. :^) >=== -minor edit
Edited last time by Chobitsu on 11/14/2023 (Tue) 14:20:57.
>>26399 >Good-bye fun-noodle spine and heavy mech-grade actuators! Lol. The old spine does look pretty cool Anon. :^) >with tension of the ribcage holds the bolts in place eliminating the need for nuts I'd consider the ability to maintain good physical integrity despite mechanical orientation or dynamics to be a high-priority for our robowaifus, Mechnomancer. Do you not see things that way? > cf. Tensegrity < of course, I personally have this wild idea that our robowaifus will be capable of world-class Parkour exploits, once we solve all the strength+power-to-weight issues effectively enough, so there's that... :DD Maybe I've watched too much animu/scifi? :^) >Already got some ideas to reduce the width Yes. Anything we can all do to reduce the volume necessary for internals, the better things will work out for everyone here & elsewhere. Good luck with your research along this line Anon! >My mech once caught on fire (don't worry it was small enough to blow out) and that is an experience I want to avoid repeating. Safety first! :^) >=== -minor edit
Edited last time by Chobitsu on 11/14/2023 (Tue) 14:47:40.
>>26405 I have it on my Raspi but not my PC (on a external disk which isn't connected to the Raspi right now). Trying to install it on my PC didn't work when I tried some hours ago. I might post something in the related thread: https://alogs.space/robowaifu/res/12.html#8678
Open file (93.05 KB 804x604 rsmedia.jpg)
>>26405 >Robowaifu Motion Capture/Motion Control As I understand it, Kibo-chan uses the Blender Servo Animation library. This allows bone rotation in blender to be translated to servo id/pwm signals and fed to an attached arduino, allowing the robot to be animated just like a videogame character. Sauce: https://github.com/timhendriks93/blender-servo-animation-arduino I haven't tried attaching the arduino; I modified a companion plugin to export a plain list of servo positions per frame instead of a highly formatted mess. Unmodified plugin here: https://github.com/timhendriks93/blender-servo-animation#readme I'm aiming for similar flexibility to the RSMedia's development kit (see attached), which allowed custom movement/sounds to be uploaded to a special Robosapien V2.
>>26420 Very nice information Mechnomancer, thanks kindly! I'll plan to add these links into the thread's OP. Cheers.
Open file (1.11 MB 633x848 spoodsit.png)
Got spud onto the old display stand, inspected the wiring and got the idling program running. However, OpenCV stopped recognizing the usb webcam even though it is still recognized by Cheese. So I'm reinstalling opencv via pip, and if that doesn't work I'll try apt-get.
Open file (363.49 KB 462x517 PaperOtter.png)
>>26437 Is that upscaled papercraft hair?
Open file (1.25 MB 884x970 motahs.png)
>>26438 >papercraft Yup. Pepakura is a good way to check part proportions for relatively cheap. I also trace parts from paper onto 26 gauge steel/pop rivet the tabs together for the more complicated armor-y bits of my larger projects. In comparison, SPUD is much easier than those (no need to schlep around 100lb parts lol). Found out the issue with the webcam. After reinstalling Opencv (a long process) I found the default webcam backend for Opencv -called "GStreamer"- has issues in its latest version, so I wrangled it into using the older (?) FFmpeg backend, so not only does it work but it runs faster. Gstreamer still throws errors but it doesn't stop the code from running so I consider that a win. The narrower hip servo mount finished printing after 13 hours/50 meters of filament, so Spud's hips should now be a bit less Willendorf. The new top piece makes 16 inches at the widest point on the hips/thighs, as opposed to approximately 22 inches for the older lower one. So, just gotta tweak the code for the new encoding method and I should have the demo of the face/eye tracking hopefully tonight. Also show off the papercraft... uh... chest panel.
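If anyone hits the same webcam/backend problem, one hedged workaround in recent OpenCV builds is to request a specific capture backend explicitly instead of letting it auto-pick. V4L2 is the usual choice for USB webcams on a Pi; whether this matches the exact fix described above is an assumption.

# force a specific capture backend instead of OpenCV's auto-selected one
import cv2

cap = cv2.VideoCapture(0, cv2.CAP_V4L2)   # or cv2.CAP_FFMPEG / cv2.CAP_GSTREAMER
if not cap.isOpened():
    raise RuntimeError("webcam failed to open with the requested backend")
ok, frame = cap.read()
print(ok, frame.shape if ok else None)
cap.release()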
>>26437 Sounds like your prep work is progressing apace, Anon. BTW, there are dedicated smol cams that literally have Linux + full install of OpenCV directly onboard the camera. Makes this style of integration/maintenance close to a non-issue as long as you leave them dedicated solely to their task at hand (JeVois cameras, >>1110). http://www.jevois.org/
>>26441 >Pepakura is good way to check part proportions for incredibly cheap. FTFY :^) I think it's an amazingly-effective approach to many issues we face here. I've even used paper as a direct part of practical structural supports intended for real, highly-inexpensive (read: Everyman's) robowaifu kits. Great progress with this interim goal of volume consolidation. Of course, closer-quarters imply higher heat concentrations, and therefore may indicate some type of passive/active cooling locally. Cheers Mechnomancer. :^)
>>26444 >$300 camera Unless I experience performance issues, Ima stick with single-board computing/webcam for the robowaifu except for LLM/speech recognition. Keeps it simpler and costs down. When I factor costs, I also include time. I had to re-write all the ambient movement code because reinstalling opencv broke it somewhere (or I'm just really bad at coding), but either way here's a close-up of the face wandering around and the eyes tracking my face every few seconds (using a haar cascade; still have yet to get the machine learning library running so it can recognize individual faces). Also lowered the eyelids a bit so they obscure the top of the iris. Didn't have time to put on the papercraft vest/chest, might wait until I build the legs (servo flanges should be arriving tonight).
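For anyone curious how little code a Haar-cascade face follow takes, here's a hedged sketch; the mapping from face position to an eye-icon offset at the end is made up for illustration and isn't SPUD's actual code.

# sketch: find the largest face each frame and map its centre to an eye offset
import cv2

cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
cap = cv2.VideoCapture(0)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    if len(faces):
        x, y, w, h = max(faces, key=lambda f: f[2] * f[3])   # biggest face
        cx = (x + w / 2) / frame.shape[1]    # 0..1 horizontal position
        cy = (y + h / 2) / frame.shape[0]    # 0..1 vertical position
        # placeholder: shift the iris sprite toward (cx, cy) on the eye LCDs
        print(f"look at {cx:.2f}, {cy:.2f}")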
>>26463 Notice: You've doxxed yourself in your video, Mechnomancer. Our policy up till now has been simply to delete anything that's personally-identifying, if it legitimately can be construed to be an accident. OTOH, perhaps this has been a bit overbearing, so in this case I'll leave it up to you: I recommend we delete your video, and you can reupload an edited/censored version if you wish. >the ambient movement code That looks nice Anon. Nice subtleties. >servo flanges should be arriving tonight Looking forward to the next updates. Cheers.
Open file (3.06 MB 2251x827 hipbiece.png)
>>26466 Oh I'm active in several online communities, many more contentious than this one :) but if you'd prefer, I can delete it and use my Wyatt Earp action figure for a face recognition demo instead. Besides, if one aims to sell robowaifu kits, digital or otherwise, that's gonna leave digital footprints. Hip-piece servo hubs have been completed with a knuckle for the secondary leg parallelogram linkage. Just need to 3d print more bits.
>>26468 >but if you'd prefer I can delete it and use my wyatt earp action figure If you would, thanks. I'll rm your vid this time for you. You're on your own hereafter unless you point out a mistake to us. >Besides if one aims to sell robowaifu kits digital or otherwise that's gonna leave digital foot prints That's yet to be determined. In the meantime, every anon here should exercise due diligence. Generally-speaking, Leftists can be quite vicious & demonic in my experience. Typical Filthy Commie stuff. Just use commonsense Anon. Remember, the very mild opposition we are facing today is as nothing compared to what is coming once the Globohomo sees their favorite pet feminism begin crumbling down as a result of widespread adoption of robowaifus by men the world over. >Just need to 3d print more bits. Looking good Mechnomancer! Eagerly waiting on the next updates!
Open file (1.67 MB 914x897 spuddy.png)
Need to re-print some parts as the pvc mounting points need further reinforcement, extend the ankle joint adapters and make the shins 6" longer. Even so, SPUD kinda stands.
>>26497 Excellent! Nice work today Mechnomancer. The hips rework is a nice improvement.
Open file (842.07 KB 603x793 spudpanel.png)
While waiting for more stable/reinforced ankle joints to print I applied some papercraft. Torso seems a bit long and shins too short, but since they're connected with pvc fiddling with the proportions shouldn't be too bad. Once I get ankle joints printed I'll be installing the gyroscope and powering up all the leg servos to test this neat simple bit of code I cooked up for making it balance.
>>26576 Nice work on the boobs, I love the early years lara croft look.
>>26577 Thanks, the final panels will probably be made by laying cloth onto a form and painting it with acrylic/latex paint to create the shape, then foam backing. The chest is more annoying than you'd think, because I have to reach around 'em to adjust stuff :/ SPUD is much higher poly than Tomb Raider 1 Lara tho lol
>>26576 It's an interesting effect of psychology that even a crude approximation of a robowaifu's shell makes her feel more 'alive'. Nice work Anon! Keep up the good work. :^) >>26578 >laying cloth onto a form and painting it with acrylic/latex paint to create the shape, then foam backing. Sounds like a great idea.
Open file (722.96 KB 796x900 render paper.png)
Putting all the 3d models together into a single file to check proportions and whatnot. Made a new chest panel specifically fitted to the model, and working on a new hip piece. Structure of the shins is actually thin enuff I might just remove the gundam panels and instead give it human-scale shins with big chunky boots to hide the ankle joint/big feet (and probably put batteries in the feet to be more bottom-heavy). Also need to get around to designing/replacing the wriggly ears with those chobit-style ports for the HDMI/usb hub.
>>26597 Excellent! Looking forward to seeing the new additions, Mechnomancer. Cheers. :^)
>>26597 The absolute madman!
Open file (4.79 MB 480x854 face_v1.mp4)
Did the papercraft overhaul (more on that later) and working on implementing a locally hosted object recognition AI model. It will inject what it "sees" into the LLM prompt before sending it off to a personal server. 2 tiers of objects: static objects such as "bookshelf", "Entertainment center" or "toyshop", and dynamic objects like "microphone", "spotlight", "cup" and "person", with eye-tracking on the most prominent object or any person. I'd show that off, however deploying it on a Pi is more annoying than on windows, so in the meantime I tested out the cloth face using a manual servo testing board. It's a little rough -eyelids are dark grey and rather recessed so on cam they show up almost black, and the cloth isn't entirely smooth- but it does prove the concept. Gonna do a complete soft-face overhaul so the eyelids will be like the cheeks, nice and stretchy, and not so sunken.
>>27098 It looks creepy as fuck.
>>27098 Neat! You're making good progress with the head. We've talked here about impregnating a silicone-like material into a clothlike outer shell. Any chance you'll be experimenting with that approach? I see you've in fact begun mocking up the Chobits earpods. Nice! :^) >Gonna do a complete soft-face overhaul so the eyelids will be like the cheeks, nice and stretchy, and not so sunken. Looking forward to seeing that Mechnomancer. Keep up the great work!
>>27098 >what it "sees" where did you put the cameras, thought the eyes were just screens
Open file (391.00 KB 965x828 aigis_red_tie.png)
Open file (1.28 MB 1200x680 ClipboardImage.png)
>>26576 >Papercraft Based, would recommend laminating it or coating it with a silicone. >Blonde, red tie Just like my wife! >>27098 Please, don't use a cloth face. It works a treat for everything else. A cloth face looks like a Mondasian Cyberman every time no matter what. Face development is truly difficult. You'll get there eventually. Your Chobit ears are cute!
>>27176 It's at an intermediate stage right now; I plan to color the cloth with some fleshtone acrylic paint. Still need to migrate the connectors into the chobit ears, but I just recently got a shipment of filament so I should be able to get started making those out of PLA and overhauling the face so it looks less possessed when the eyes are closed. If I do experiment with silicone faces it will be in the springtime, as I have no workspace I could safely ventilate that is also heated. Chicks dig scars but no one can see scars in your lungs so they don't do anyone any good.
>>27098 >>27204 >so it looks less possessed when the eyes are closed. Haha. Not sure if we discussed the mouth articulation issue ITT or not, but the topic came up in the earlier Elfdroid Sophie threads. The >tl;dr is basically that without a significant (read: expensive) design & engineering effort, it's rather unlikely to be a pleasing outcome to go for a single, flexible 'skin'. The choice that SophieDev came down on was a 'Nutcracker'-like mouth. Have you considered that approach yet, Mechnomancer? >chobit ears Deffo looking forward to seeing this approach used! :^)
Open file (196.14 KB 352x303 face strings.png)
>>27220 >mouth articulation be hard yo Strings/cables running from around the mouth to various points on the head to move the mouth corners (the green in the pic; shooped in eyes so I stop scaring the 'technician') - even gently poking the fabric at the mounting points makes a big difference. Lots of the fabric messiness around the mouth can be fixed by using properly sharp scissors while cutting, and putting a few layers of acrylic around the edges o3o Soon I'll be deploying oobabooga onto the raspi and getting the TTS connected (Espeak sounds robotic but it does real-time even on low-end systems). Might go for a hardware rather than software solution for moving the mouth up/down: an LED equalizer next to the output speaker to pick up the speech, with a wire from one of the LEDs heading straight into a GPIO pin, which is only listened to while SPUD is talking.
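A hedged sketch of that hardware mouth-flap idea: poll the GPIO pin wired to one of the equalizer LEDs and open the jaw servo whenever it reads high, only while the TTS is playing. The pin number, servo channel, angles, and the is_talking flag are all placeholders, not SPUD's real wiring.

# sketch: read an LED-equalizer output on a GPIO pin and flap the mouth servo
# pin 17 and servo channel 4 are placeholders
import time
import RPi.GPIO as GPIO
from adafruit_servokit import ServoKit

MOUTH_PIN = 17
kit = ServoKit(channels=16)
GPIO.setmode(GPIO.BCM)
GPIO.setup(MOUTH_PIN, GPIO.IN, pull_up_down=GPIO.PUD_DOWN)

def flap_while_talking(is_talking):
    while is_talking():                        # e.g. a flag set by the TTS thread
        open_mouth = GPIO.input(MOUTH_PIN)     # high when the speaker is loud
        kit.servo[4].angle = 60 if open_mouth else 20
        time.sleep(0.03)
    kit.servo[4].angle = 20                    # close the mouth afterwards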
>>27223 Yes, that might be a workable idea Anon. Especially if you secured a couple of orbicularis rings around the lips before the zygomaticus retractors. Looking forward to seeing what you manage with this issue.
Open file (2.97 MB 426x240 ruff_face.mp4)
Going thru my vidya archives I just remembered when I built this little fellow. Might try to scale it up and make a papercraft face while waiting for parts to print.
>>27251 That's charming Mechnomancer. Totally-not-Kibo-chan is a good example that we can achieve nice, empathetic characters without delving deep into the mines of the Uncanny Valley.
Open file (3.97 MB 480x596 kami-chan test.mp4)
>>27254 I call it a "kami-chan" model of the SPUD line, "kami" spelled with the kanji for paper (紙ちゃん). I kitbashed a few papercrafts - scaling them up- and threw together a larger version. Things are a little crooked due to how the layers are mounted but the idea is there.
>>27289 Very nice, Anon. I'm planning to use a papercraft shell for my initial robowaifu prototypes (shell durability isn't a high-priority for me at the initial stage, but rather rapid design changes/mockups, &tc.) BTW, have you looked into software that can do 'unfolding' for you? That is, it can take (say) a Blender 3D model, and then give you 'cut' lines to make (exactly like marking edges hard/soft) with the goal of making flat prints (like Kiwi's mesh flats : >>26066, >>26153, ...), which can then be directly folded/assembled into a 3D-shape. This idea is very common in the papercraft world ofc, but it also works for 3D printing. >=== -minor edit
Edited last time by Chobitsu on 12/15/2023 (Fri) 10:27:58.
Open file (669.61 KB 523x891 fullbody.png)
Open file (10.33 MB 1080x1920 Data.mp4)
>>27290 The software you're describing is pepakura (other, more elaborate programs similar in nature exist for designing sewing patterns), which has been in use by diy communities (my experience is designing cosplay props/armor) for over a decade. Using pepakura is how I made Spud's paneling. I need to get around to making some better hair tho :) Pepakura costs like $45 for a registration key, but even with the unregistered version you can unfold 3d models and print to pdf, you just can't save editable files. I finally managed to get mediapipe running on the raspberry pi so it can track me while my face is hidden (good thing? bad thing?), and have a combo of 2 object recognition libraries running (one with objects and the other general image recognition). The servohats have a tendency to get unplugged from their power source so I gotta fix that before SPUD can move again. Also hacked a cheap led "equalizer" into a gpio input for the raspberry pi. The equalizer hangs out around the bluetooth speaker and sends the signal to open/close SPUD's mouth according to noise level (have yet to connect it to the mouth servo, just have a print script running atm).
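For reference, a minimal MediaPipe face-detection loop looks roughly like this. It's a sketch only, not the combined two-library setup described above; model_selection=0 is MediaPipe's short-range model.

# minimal MediaPipe face detection sketch
import cv2
import mediapipe as mp

detector = mp.solutions.face_detection.FaceDetection(
    model_selection=0, min_detection_confidence=0.5)
cap = cv2.VideoCapture(0)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    results = detector.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
    if results.detections:
        box = results.detections[0].location_data.relative_bounding_box
        print(f"face at x={box.xmin:.2f} y={box.ymin:.2f}")   # normalized 0..1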
>>27438 >pepakura Great! So you are already aware of the idea then. I just wanted to make sure you weren't overlooking something to make things easier for you! :^) >(good thing? bad thing?) <le ebin anonymoose Sure, that should be sufficient to keep outsider amateurs at bay, which at the moment are the primary concerns. You'd need OG opsec to deal with the pros (full bodysuit, blurs, obfuscation, etc.) but I think this is sufficient for this point in time during Current Year -era, Anon. >The servohats have a tendency to get unplugged from their power source so I gotta fix that before SPUD can move again. Huh, I wonder why? >The equalizer hangs out around the bluetooth speaker and sends the signal to open/close SPUD's mouth according to noise level Pretty sure this was the route SophieDev chose as well (or something functionally similar). That fullbody is looking better Anon! Looking forward to your continued progress with SPUD, Mechnomancer. Cheers. :^)
>>27438 >zooms in >Gigachad 0.9982279331909
Open file (3.36 MB 1080x1920 Spuds Merry Christmas.mp4)
>>27462 The pca9685 servohat is a goofy little beastie that has problems with the official way to power it. So I came up with some alternatives (plugging in power where a servo should go). So far with WW 2.41 everything is on-board: no reliance on third-party stuff on servers located in parts unknown (I was working with chatgpt starting in feb and witnessed firsthand the neutering thereof over the course of several months). Until I either get folks to help translate the pocketsphinx library out of its hard boston accent or find another locally hosted speech-to-text engine I'll be using generic google speech-to-text. Spud wishes you all a Merry Christmas, and she can even "say so" herself lmao (complete with horrible anime-style lip synch). I'm working on a revision to the face to close the gap between eye screens and face surface to less than 1cm, and subsequent proper adjustments to the mouth to make it more fluid.
>>27496 Excellent progress, Mechnomancer. Nice work! Merry Christmas to you too bro. Please stop by the /christmas/ festival this weekend. Good luck with the further Spud design improvements. Cheers, Anon. :^)
Open file (892.33 KB 1920x1080 audio.mp4)
>>27497 Thanks. I appreciate the invite but I'm going outta town for Christmas. Yknow, normie type stuff lol. I just cobbled together a better TTS engine client/server script tested on localhost (audio gen took 1.3 seconds using the AI model on an Nvidia RTX 2080). Deployment of the client onto the raspi (and server config) will probably have to wait until after I return from christmas, but the voice sounds much better. ".wav" files are verboten so I just plopped the audio onto a quick little vid.
Open file (723.37 KB 1024x768 chobits.jpg)
>>27513 Excellent. Yes, that's much better. May you have a Merry Christmas holiday with the fam and whatnot, Mechnomancer. Looking forward to your safe return with us here on /robowaifu/ . Cheers. :^)
Very impressive, what can I say. I don't know if you'd like to collaborate, seems like you don't need any help, but if I could take a peek at the files please. Good job. Wish I could say I did anything special for christmas but for the most part it was just another day :/ You should consider getting some funding by the way, like seriously. Like kickstarter at least. It does look kind of creepy though. Maybe I could help with the skin? Idk
Open file (1.18 MB 3264x2448 Spud New Face.jpg)
Open file (2.02 MB 3264x2448 Spud Current face.jpg)
>>27613 Thanks. I'm not comfortable releasing the files tho until I have some documentation to go with it - or at least a version I am happy with. I hold myself to (what I consider) high standards, probably higher than what is good (or profitable) for me. The current eyelid mechanism turned out a little wonkier than expected. Ended up with over 3cm between the eyelid and the face surface to get the eyelids to blink, giving it that FNAF eyeball look (major parallax distortion when not viewed head-on; can barely see the eyescreen frame-left in "Spud Current face.jpg"). But I'm working on a new mechanism to reduce the distance to less than 1 cm and have cloth eyelids slide linearly over the screens ("Spud New Face.jpg"), using a ripped mmd 3d model for reference. Cleaning up the sides of the eye openings might be a challenge, but the fabric also tends to curl inward so that might make it easier. Best option to have the avatar (robowaifu body) connect to the host LLM computer will probably be via a LogMeIn Hamachi network (saves issues with port/ip silliness). PyTorch audio might work for locally hosted speech recognition; have to test it on my main computer then attempt pi deployment.
>>27638 Good luck with the eyes, Mechnomancer. I'm sure you'll figure it out and again I recommend you glance over SophieDev's work. Might get some inspiration there. >Best option to have the avatar (robowaifu body) connect to the host LLM computer will probably be via a LogMeIn Hamachi network (saves issues with port/ip silliness). So is this a local network (server in your own house), Anon? Some type of elaborate network hole-punching technique for such a setup is both overkill, much too complex in its attack surface, and: < LogMeIn ...lol no. The le 'Proud Wakandan / Stronk Independynt White Tranny' pics on their front page is not a good look for anons. Once such a company discovered that you are using its products for running cute & affectionate robowaifus of all things ("REEEEEE you're killing Feminism!111one!!") -- they would immediately cancel you, and jack-booted GH Big Police-State thugs would likely soon be kicking your front door in for an ensuing struggle session :^). Just use simple static IPs and problem solved (and this could easily be automated via scripting). Cheers, Anon. :^) >=== -prose edit
Edited last time by Chobitsu on 12/27/2023 (Wed) 02:59:57.
Open file (38.37 KB 930x475 workflow.png)
Using Pytorch for speech recognition is more of a pain than it is worth (some wonky thing about requiring a version of ffmpeg that isn't dependent on python to capture streaming mic audio or something), so I found Vosk to be a good alternative. Locally hosted with a 40mb model and purportedly works on a Raspberry Pi! Voice commands for a raspberry pi... *glances at mek and powerarmor with malignant intent* Workflow is almost complete. Just need to add the TTS AI into the mix, then split it into the server/avatar scripts. Separating the emotes out of Oobabooga's responses is nothing fancy, just some annoyingly involved string manipulation (replacing asterisks with parentheses to enclose the emotive language is harder than it sounds!). It is kinda fun talking to it, like magic seeing the words I say appear, then a response - I read a lot of vintage scifi as a kid so I have an unusual appreciation for this sort of thing (Asimov eat your heart out). However, my voice is low enough that the Vosk model sometimes misunderstands, but even google speech has some trouble with it (and smart TVs do not understand/hear me at all) so I'm not surprised. The AI gets kinda confused, too. XD Talking in a higher pitch seems to solve it. Tempted to use SPUD to apply to a local makerspace "startup bootcamp program", marketing it as an artificial assistant/greeter. So like one of these, but more sophisticated/owo. https://www.youtube.com/shorts/UiYeUq-rAJk Companies would probably like some artificial secretaries/greeters that can not only answer questions but also keep their data private. You do not sail straight into the wind, you tack :)
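For anons who want to try the same thing, a bare-bones Vosk microphone loop is only a few lines. This is a sketch assuming the small English model has been unpacked to ./model and that the sounddevice package is installed for mic capture.

# bare-bones Vosk speech-to-text from the default microphone
# assumes the ~40MB model is extracted to ./model
import json, queue
import sounddevice as sd
from vosk import Model, KaldiRecognizer

q = queue.Queue()
model = Model("model")
rec = KaldiRecognizer(model, 16000)

def callback(indata, frames, time_info, status):
    q.put(bytes(indata))

with sd.RawInputStream(samplerate=16000, blocksize=8000, dtype="int16",
                       channels=1, callback=callback):
    while True:
        data = q.get()
        if rec.AcceptWaveform(data):
            print(json.loads(rec.Result()).get("text", ""))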
>>27658 >Workflow is almost complete. Sounds good! Good luck with this subsystem's progress Anon. Cheers.
>>27662 Thanks, I'm thinking about doing a first public release of a few simple scripts that let folks talk to the oobabooga LLM via VOSK speech detection, and have it talk back with the TTS (like the workflow except on 1 computer). There seems to be zero documentation or examples about combining all these AIs together (speech detection/LLM/TTS) so I might as well be the first. If I'm feeling fancy maybe display a pic of a waifu avatar with mouthflaps :D
>>27687 I kind of did something related to that, but it was for a waifu api that's now dead https://github.com/peteblank/waifu-conversation
>>27688 Lmao wtf get your waifu api for $999,999 a month hahaha
>>27688 >ded >>27689 >crazy cost Yeah that's why I'm looking into locally hosted stuff. Free from corporate overlords and their predatory pricing/nerfy-updates. Worst case scenario I upload my current version of oobabooga/models on github w/ installation instructions.
>>27690 Is anybody paying that amount lmao Id be willing to host an api at a much more reasonable price. Say $888,888 a month.
Well, now I can talk to the LLM all on my mid-tier gaming computer (next step is to distribute some of the load onto the Raspberry Pi). The AI's voice has some reverb/echo cuz my mic is also picking it up. I need to tune the character cuz it seems a little weird, though.
>>27724 Excellent! The response latency is nearly negligible at this point Anon. You definitely have the right idea trying to economize the backend hardware involved. And of course the 'frontend' (onboard) hardware will need to be extremely efficient in Flops/Watt to remain viable for low-end (read: cheap) robowaifu designs. Thanks for the update, Mechnomancer. Keep.Moving.Forward. Cheers. :^)
Oh I see that you posted an earlier model in the other thread. You're almost there yeah, if you can make her look nicer... But again, amazing work.
>>27223 Really I think that may be the main thing. The mouth... The nutcracker SophieDev did also looks kind of weird. But if you combine the nutcracker with a mask or something, that might make it look better I think.
Open file (6.41 MB 1364x768 expressionschatgpt.mp4)
>>27724 Going over the code, I can only conclude the LLM seems insane because it put crucial elements into its emotes (actions described between asterisks), which I removed from the response so they didn't clog up the TTS. For diagnostic purposes I added them back to the printout (but not sent to the TTS). It is pretty easy to detect words in a string then emote accordingly (assuming you have the graphics, etc); here is a vid of me doing just that with chatgpt early last year. The slow nerfing of chatgpt made me sad. A basic sample of detecting emoting content (with python):
if "happy" in response:
    print("happy expression")  # or whatever code you want to run to express happiness
ChatGPT is fine with working with limited responses for emoting. Locally hosted (& less sophisticated) LLMs can too with the proper prompt, but they do sometimes come up with words outside of the (admittedly small) list. That's what I got all you for, to come up with lists of possible emotes lol
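A hedged sketch of the same idea, including the asterisk-to-parentheses swap mentioned above; the emote-to-expression mapping is just an example set, not SPUD's actual list.

# sketch: pull *emotes* out of an LLM response and map keywords to expressions
import re

EMOTES = {"happy": "smile_eyes", "sad": "droop_eyes", "angry": "angry_eyes"}

def split_response(response):
    emotes = re.findall(r"\*([^*]+)\*", response)        # text between asterisks
    spoken = re.sub(r"\*([^*]+)\*", r"(\1)", response)   # asterisks -> parentheses
    clean = re.sub(r"\([^)]*\)", "", spoken).strip()     # drop emotes from the TTS text
    return clean, emotes

def pick_expression(emotes):
    for emote in emotes:
        for word, expression in EMOTES.items():
            if word in emote.lower():
                return expression
    return "neutral_eyes"

text, emotes = split_response("*smiles happily* Hello anon!")
print(text, pick_expression(emotes))   # -> Hello anon! smile_eyes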
>>27973 Very interesting progress Anon. We do have an Emotions thread : (>>17027) if you really want to open up a bigger, general discussion here on this topic. I'd certainly recommend we do so as a good way to kick off the New Year. And of course, this area is closely-related to Cognition as well : (>>24783) . And since emotions are often a deep part of social engagement, maybe the Society thread can provide some good context for your work : (>>106) , as well as our Personality thread : (>>18) . Regardless, as you point out LLMs have very distinct limitations (even if you have multi-billion dollar data centers, as our common enemies do), and I consider some of them fundamental. This is why we here are seeking for another solution in our Cognition thread; one that will run fully opensource & disconnected, on Anon's local hardware. >That's what I got all you for, to come up with lists of possible emotes lol Heh we're all in this together, Mechnomancer! Cheers. :^) >=== -minor edit -add'l crosslink
Edited last time by Chobitsu on 01/03/2024 (Wed) 07:09:26.
>>27978 Okay... what do you mean by AI? Do you mean you wish to make a language model now? I'm going to use something like llama2; that is not a problem for me. If you wanted to help, you'd make a movement AI and you'd set goals on what is to be carried out. If you wish to make a language model you need goals too.
>>27978 >Open Source and locally hosted LLMs I present Oobabooga, a locally hosted web UI for LLMs that aims to be the stable diffusion of text generation. The OpenChat model is comparable to Chatgpt before the great Nerfing (and about 16gb), but Pygmalion (8gb) is ok too. https://github.com/oobabooga/text-generation-webui >>27979 Already got an LLM from hugging face and the ability to create animations for the physical body using blender (see first attachment in first post)
>>27289 Oh wow, I missed that one. That one also has mechanical eyes instead of screen eyes. I don't know if mechanical or screen eyes are the way to go. I think mechanical, in my opinion. >>27980 I'm going to make the skin for the face with TPU filament. I also made a stand to hold the face in place. I might have overdone the stand; it's almost as tall as the 3d printer (Ender 3). The filament is black however. I might have to paint it unless we want blackfu. It'd also be nice for the hand. However, for the hand for example, I'm wondering if the TPU skin might not be flexible enough for the finger movements...
Open file (231.64 KB 1280x720 IMG_20240103_203104.jpg)
>>27981 It actually didn't come out as tall as the 3d printer. I've been thinking my ol' ender 3 is good enough, but not really. The more I print on it the more I notice its flaws. Look at this gear for example. Although I did set the quality to standard, but still.
Open file (6.63 MB 1080x1920 New_Year.mp4)
Need to adjust the noise level detection a bit for the mouth. Not quite yet ready to deploy the client-server setup as I'm trying to get a better TTS engine running. Emotivoice has not only TTS but emotional intonation that can be locally hosted. I just need to figure out how to use "Docker" and hopefully I have the power to run both at once without much computational delay. If I could figure out how to manipulate .wav files -adding audio effects- via python that would be nice but there doesn't seem to be much out there.
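On the .wav question above: pydub covers a fair bit of simple effect work. A hedged sketch follows; a basic delay/echo is just the clip overlaid on itself, quieter and shifted, and the file name here is a placeholder (pydub only needs ffmpeg for non-wav formats).

# sketch: add a simple echo to a TTS wav with pydub (pip install pydub)
from pydub import AudioSegment

voice = AudioSegment.from_wav("tts_output.wav")
echo = voice - 9                            # same clip, 9 dB quieter
out = voice.overlay(echo, position=120)     # overlaid 120 ms later
out = out + 2                               # small overall gain boost
out.export("tts_with_echo.wav", format="wav")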
>>28031 Did you change the mouth mechanism? Nice, I think it goes up and down now, right? I had this idea of the skin being TPU filament. You're using something fuzzy... Yeah, if you can get the skin to look somewhat like a sex doll and put on a wig, it'd look nice-r in my opinion.
Open file (50.74 KB 200x200 15907.png)
>>28033 Mechnomancer. I appreciate what you're doing, however if you want to sell it I do believe that people are quite interested in the sexbot aspect and it looking nice-r. Now whenever somebody says something can not be sold, I point to the token known as obamaharrypottersonic10inu I think it was called. Now if that can be sold, anything can be sold.
>>28034 As I mentioned earlier in this thread, I am focusing on a nice solid frame that can handle further... infrastructure. Furthermore, I also previously mentioned the current face is a proof-of-concept and may experiment with casting silicone & such later this year when weather permits. There are plenty of the "fun bits" available pre-fab if you know where to look (*cough* spamazon *cough*), such as anatomically correct full-body suits for "cosplay" or a simple morphsuit if you're strapped for cash.
>>28031 Nice progress Anon. The shell (fabric?) skin is coming along. Do you have further plans for more-closely lip-syncing the mouth with audio? Cute touch with the Santa hat BTW. Cheers. :^)
Open file (399.47 KB 1217x443 earz.png)
>>28044 >le lip synch The mouth flapping is controlled by what was originally an LED equalizer box connected to a GPIO pin. It has a little knob to adjust the levels, which is what I need to fiddle with. Working on 3d printing chobit-style ears (one for ports and another for webcam), need to re-do a piece as I miscalculated the gap for the HDMI port.
>>28053 Mechnomancer, are you from Wisconsin? This site might be Wisconsin supremacists... I will say however, the Poppy and InMoov creators come from France. I am not a Frenchy supremacist, however I do notice they contribute to this robotics stuff often.
>>28055 Well, I REALLY better get going; I'll be seeing you on your youtube channel I guess. There were some disagreements here, I guess.
Open file (641.00 KB 1280x919 chii_ears.png)
>>28053 >chobit-style ears Really looking forward to it.
Open file (3.84 MB 854x480 nodata.mp4)
>>28053 >I was so excited to receive my first persocom 25 years after reading Chobits but imagine my disappointment when I found out she didn't even come with an RCA cable! Now I have this futuristic piece of tech just sitting there telling me to plug her into a HDMI port to continue installation, probably judging me for owning a CRT. Don't waste your money. Wait until someone makes one that isn't just for kids who never read the manga.
Open file (13.88 KB 351x356 i_was_like_JUST.jpg)
>>28067 >mfw
Open file (911.60 KB 500x350 booma.gif)
>>28067 tfw you forget HDMI to rca converters exist :)
Open file (6.85 MB 960x540 newface_step1.mp4)
New faceplate installed: gap between eyescreens is reduced by 10mm, but with a few tweaks I will probably be able to squeeze it closer by another 5mm before attaching the cloth screen w/flexible cheeks & eyelids. I'll probably need to re-print the chobit ears, as I didn't put screws in the tips to secure the two halves together, so there is a gap of about 3mm. The hole for the webcam is also slightly too small for its outer diameter, so Spud's view of the world looks like peering thru a porthole. Also slightly increased iris size (don't know how it will look with the cloth face covering it, but it is an easy tweak). I'm thinking of using cloth pieces to build torso musculature in layers (such as in the video below) to put between the pla frame and any external (morphsuit) covering. Probably won't work out (look too uncanny) but it would be fun to try and not too expensive: walmart has cloth bolt ends of like 2m for $2-$4. https://www.youtube.com/watch?v=tGgwA7IY0hY
>>28131 >muscles Costumers use shaped foam to create definition and musculature in costumes; you're doing the same thing here. Get a thick foam block and carve muscles from that.
>>28131 Seems like a silly question now, but why fixate on physical eyelids? It would probably be way easier to closely fit these "eye" screens to the faceplate and have them play some stylized blinking interruption rather than trying to add physical eyelids, so I am assuming it's more of a personal "why not" ?
>>28135 I think the eyelids are important. If you don't believe me, watch Lucky Star.
>>28135 In earlier versions I did have a blinking animation on the LCDs; however, when the eye-screens are off it looks possessed (ಠoಠ)و† Plus it's more difficult, and I love me a challenge. I could do non-lcd eyes (doll eyes, etc) but I'm enamored with symbol irises. I've seen some work on embedding them in physical/animatronic eyeballs, so eventually that might be an option.
Open file (16.93 KB 341x192 5034-329920408.jpg)
screen face is for karens
>>28131 Shaping EVA foam for the volume and putting cloth over it would look and feel good while keeping the weight minimal.
>>28131 Nice progress, Anon. It's a cute addition! :^) >I'll probably need to re-print the chobit ears Ehh, you'll get it sorted out I'm sure. Keep moving forward! >I'm thinking of using cloth pieces to build torso musculature in layers Is this for simulating volumetric bulk of her 'musculature'? If so, then I'm personally very-inclined to anon's suggestion about utilizing LARPfoam (>>28173) for such uses. It's certainly what I'm planning for the exterior substrates in my own designs. Cheers. >>28173 Very cute Aigis arts! :^)
Open file (696.24 KB 891x663 smine2.png)
Open file (1.01 MB 897x669 smine1.png)
Open file (444.23 KB 493x657 sat.png)
Open file (92.85 KB 1277x1022 balancing code.png)
>>28197 >le eva foam I mentioned EVA foam earlier ITT; problem is, EVA foam doesn't really stretch. Adding a second joint in the spine to assist with future balancing (using a GY-521). In the 2nd revision of these parts I got the tolerances right for the joint, but the lower anchor was perpendicular to where it would attach to the waist, so I'm re-printing the yellow bit. Spud got tired of standing so is now sitting on the workbench :3 Need to make the head less egg-shaped if I want to properly fit the wig on there. Also sharing a basic, 1-axis balancing code you can do yourself in python. Can't remember if the error handling works or if I added too much but ¯\_( ͡° ͜ʖ ͡°)_/¯
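Since the balancing code itself is only shared as a screenshot, here's a rough sketch of the same 1-axis idea in python: read the GY-521's accelerometer, estimate tilt, and nudge a servo proportionally. It assumes an MPU-6050 on I2C bus 1 and a PCA9685-style servo hat driven through the adafruit_servokit library; the gain, neutral angle and channel number are made-up placeholders, not Mechnomancer's actual values.

import math
import time
from smbus2 import SMBus                    # assumed: MPU-6050 (GY-521) on I2C bus 1
from adafruit_servokit import ServoKit      # assumed: PCA9685-style servo hat

MPU = 0x68
bus = SMBus(1)
bus.write_byte_data(MPU, 0x6B, 0)           # wake the sensor out of sleep mode

kit = ServoKit(channels=16)
NEUTRAL, GAIN, CHANNEL = 90, 1.5, 0         # placeholder values, tune on the real robot

def read_word(reg):
    # combine the high/low bytes of one 16-bit signed register
    hi = bus.read_byte_data(MPU, reg)
    lo = bus.read_byte_data(MPU, reg + 1)
    val = (hi << 8) | lo
    return val - 65536 if val > 32767 else val

while True:
    ay = read_word(0x3D)                    # accelerometer Y
    az = read_word(0x3F)                    # accelerometer Z
    tilt = math.degrees(math.atan2(ay, az)) # crude 1-axis lean estimate from gravity
    angle = NEUTRAL + GAIN * tilt           # proportional correction, no I or D term
    kit.servo[CHANNEL].angle = max(0, min(180, angle))
    time.sleep(0.02)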
Open file (1.10 MB 1402x662 newspine.png)
Open file (581.51 KB 477x661 funnywig.png)
New lower spine joint is finally complete; just need to wait for a new ASMC-04B servo to arrive, since the current one is the test unit I accidentally bricked. Then it is time for the 2nd spine transplant. Thankfully it is only a partial one (I made some notes so you have an idea how it will fit... Once it is installed you probably won't notice unless you look real close lol). Wig was too smol, that's what I get for a $10 Spamazon special lol. Jaw panel sits differently than planned so I might need to re-print that, and re-print the face with some more mounting holes, since it is currently only being held on by a single forehead screw and pressure from the ears against the head frame. Hehe, looks kinda chibi/big-headed in the pic lol.
>>28239 >>28353 Excellent progress Mechnomancer! Looking forward to seeing your redesigns in action. Cheers. :^)
Open file (5.24 MB 302x540 Carrysmol.mp4)
Since parts are not arriving until at least the 18th, I dug around in my archives and found some footage of the first robowaifu. Here is the Workshop Waifu "Carry" in the early stages; she was in fact my first project using a raspberry pi. I find the contrast between Carry and Spud striking: it is amazing they are only about a year apart :D
>>28450 > I find the contrast between Carry and Spud striking, it is amazing they are only about a year apart :D Amazing. You've come far in a short time, Mechnomancer. Good job! Actually, I rather like Carry. I particularly appreciate the fact you try to keep costs relatively-low. (>>28446) Thanks for these videos Anon, please post others you think would be helpful to beginner-anons here. Cheers. :^)
Open file (2.83 MB 302x540 Carry2.mp4)
>>28456 Ok, more Carry-posting :D Started making 'er a little taller in this vid, filmed about a week later.
Open file (953.51 KB 831x517 wigz.png)
Open file (353.17 KB 451x587 face_cloth.png)
Shipment arrived, threw the wigs together. Head looks a wee bit flat but I'm printing some skull panels (forehead first) that should help the shape. Also stretchy, pale cloth arrived for the face. The acrylic one in the background didn't turn out very well.
>>28601 Why exactly the need for fabric-skin? Is it just to hide the mechanical side of "emotive" elements?
>>28603 For a more seamless face: more natural eyelids (flexible) and to cover up the terrifying, fnaf-esque jaw :D
>>28601 >>28609 Thanks for the updates, Mechnomancer. Of course, do as you see fit, but I would highly suggest to you that you don't get overly hung-up on creating any appealing face r/n. I think your talents lie more in the mechanical side of things & I would recommend you focus more on those aspects of SPUD. In particular, I say you should consider making a serious effort this year at making her walk freely in a bipedal fashion. During this same era, there will be other anons focusing on the head/face, and then perhaps we can move forward all together by splitting the work up this way. >tl;dr I simply don't want you getting your 'truck stuck in the mud' over this area, and losing your thus-far-quite-excellent forward momentum. Cheers. :^)
Open file (289.54 KB 497x242 backup.png)
>>28611 I appreciate the concern, but I'd like SOME improvement over the past year, especially since SPUD will be one of my flagship items during my demo circuit. I'll just be slapping the new cloth on there like the old cloth face, with the addition of them cloth eyelids. Maybe I'll put some LEDs in the cheeks for blush XD Forgot to mention the final ASMC (20v, chonky) servo did arrive, so I can do the spine transplant; the result will be the ribcage being used to help balance Spud on both axes. Combined with the arrival of properly rated buck converters, I should be able to get all the motors running off an internal battery (40v 2.5Ah lithium-ion). And since there is enough resistance in the motors' gearboxes, Spud can almost stand by itself UNPOWERED, so little energy will be used while idle. However, if I can't get bipedal motion working I do have an alternative (see attached)
>>28615 >especially since SPUD will be one of my flagship items during my demo circuit. Oh yeah. That makes sense then, I'd proceed too. Just friendly advice. :^) > ... so I can do the spine transplant ... Nice! That's exciting that you're going to soon have an improved spine for her. Looking forward to that! >(see attached) Ahh, the Tachikoma trick then ehh? :D I'm planning something similar for the smol headpat-daughteru Sumomo-chan. She'll probably only be a couple feet tall at most, and her having a little 'car' to ride around in will be both cute & helpful for the both of us!
Open file (1.10 MB 815x647 metal hair.png)
>>28626 Here's a vid demonstrating the spooder leg mechanism. I'm rather tempted to build a 3.5 foot rc spider (with a nerf gun turret on top) anyway, so making the robowaifu ride it wouldn't be too hard... theoretically. https://www.youtube.com/watch?v=wypThTfbclM >daughteru tbh I think calling it a "mecha musume" would work just as well. :) I had to take Spud's bottom half apart to properly align all the heavy motors' flanges (in the legs and spine), so at a 1250µs servo pulse they're roughly at a standing position (don't want any wacky hijinks when I start coding the movements). Have to redesign and reprint (again) the lower spine anchor, as it was slightly too large for the hips to rotate freely (currently printing). I installed Spud's forehead plate and put the hair on, and improved the way the wigs sit greatly. Yeah, 2 wigs. The lower, brighter yellow one was originally a hair extension built into a baseball hat (looking like an owo Danny DeVito, should've gotten a pic for the lolz but I was in the zone), and the 2nd is a gold pixie-cut style wig. Slightly different colors but very fluffy: looks like a refugee from an 80s cyberpunk animu.
>>28627 >teh spooder leg mechanism Very cool linkage! It looks like you'd need a pretty low coefficient of friction between the pads and the surface for it to be able to pivot properly though? >(don't want any wacky hijinks when I start coding the movements) This. I like that you are not only willing and able to rework/rework/rework -- but that you apparently expect it going in. This is the key to success as highlighted by great men such as Henry Ford, et al. Just.don't.quit. This is the key to (eventual) success!! :^) >looks like a refugee from an 80s cyberpunk animu. Lel'd >=== -fmt edit
Edited last time by Chobitsu on 01/20/2024 (Sat) 06:14:58.
Open file (5.36 MB 856x480 Chonky Servo Test.mp4)
>>28631 >willing and able to rework/rework/rework -- but that you apparently expect it "How do you make god laugh? You tell him your plans" :D I'll post my bucket of 3d prints that didn't make the cut at some point. Made a vid of the chonky (180kg) servo being controlled by the robowaifu computer (raspberry pi). 5V servo control board with 15V going to the motor itself, both using buck converters from a 40V Atlas tool battery I got on clearance. Once the spine piece gets re-printed (again) I'll be connecting 'em all up.
>>28639 Looks good! Really looking forward to SPUD's advances this year, Mechnomancer. Cheers. :^)
Open file (6.44 MB 270x480 spudmotortestlores.mp4)
Test of the robowaifu head/abdomen motors without any motion filtering. I'm going to be replacing a joint in the neck so the head is a bit more balanced, then do another calibration of the leg motors via the robowaifu-puter before I attach Spud's spine to them. Kinda wobbly cuz it's only attached by 2 screws via pvc.
>>28645 That hair is totally going to get caught up in something.
>>28648 it's already obscuring the webcam in the chobit ear lol
>>28645 Nice, the servos seem to be well-rated for moving the mass around pretty quickly. I'm sure once you have the new spine mechanisms in place, Spud will be more stable. Cheers.
>>28653 Oh shiznit nice, I didn't even notice it. Good placement.
Open file (4.74 MB 1080x1920 meeme.mp4)
Well, first full face test was less than ideal. Droopy eyelids give Spud the appearance of a stroke victim D: To be fair I did it in like 15 minutes, 30 tops. Might revise the mechanism to something like didney do https://www.youtube.com/watch?v=YRDBFc-TrtM Wouldn't require much, just remove the cloth from the sliding eyelid panel and attach it inside the circumference of the eye socket.
What else to do:
- A little bit of red/pink paint in the cheeks/lips to remove that corpse-like uniformity.
- Corners of the mouth are perfect for sewing in a little wire anchor to be pulled on by a servo to move between a smile and frown. To save on servo channels I could have it synched with the eyebrows? idk
- Darker eyebrows so they're visible thru the hair.
- Felt eyelashes (instead of floppy thin fabric ones)
- Look into a design to further eliminate the gap between the face surface and LCD screens, cuz I'm always looking for a challenge.
- Calibrate the rest of the chonky servos
- Mount the gyroscope (did a test directly linking it to the abdomen servos for the lulz; I wouldn't recommend it unless you want a robot flopping around like a fish)
- Wider mouth?
What I've done since last post:
- Calibrated the left arm (adding wrist rotation that you can't really see in the video atm) and ankle joints, and documented the servo pulse positions in the servo testing library. I'm going to need a third servo hat to control all the motors. Spud is going to be so flexible lol.
- Stuffed a little padding in the face to make the cheeks nice, full and round - and pinchable, if that's your thing.
- Increased shoulder width (with cool orange pieces!)
>>28717 Impressive progress.
>>28717 Not sure how much effort you would put into designing a waifu mouth movement. Just thinking about how to intricately create functioning mouth muscle movement and a tongue with such depth and detail is just insane. I think this has to be one of the greatest challenges in developing a realistic mechanical facial expression.
>>28743 I'm certainly going to do this. We discussed this very often in other threads. This could use
- solenoids pulling on strings which are attached to the skin or some "tissue" underneath
- magnetic switches doing something like that
- magnetic sliders moving the attached skin a little bit sideways
- or e.g. solenoids pushing some skin in the mouth region forward, tipping her lips for kissing
- small air compressor or vacuum pumps?
- some variant of (electric) soft muscle
Turns out, Hamachi doesn't work on a Raspberry Pi beyond a 32-bit OS: one of its supporting daemons is out of date (it refers to a system file that doesn't exist, which I spent several hours tracking down). For now I'm using ZeroTier to set up a virtual LAN for demonstration purposes. You could easily host both on a single network without any third parties, but I plan for the avatar and server to be separated by significant distance. I've set up the speech-to-text protocol on the Pi and it runs surprisingly well; now I have to rebuild the voice engine sending protocol, as I had to change from the zmq library to the socket library. Also documenting the entire process, if not for others, at least for my own sanity for when I inevitably touch some system file I shouldn't have and bjork the Pi's sd card.
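For the zmq-to-socket rewrite, here's a minimal sketch of a length-prefixed text sender/receiver using only the standard socket and struct modules; the host, port and framing are made up for illustration and aren't the actual SPUD protocol.

import socket
import struct

HOST, PORT = "192.168.1.50", 5005           # hypothetical TTS server address

def send_text(msg):
    # client side: one length-prefixed UTF-8 message per connection
    data = msg.encode("utf-8")
    with socket.create_connection((HOST, PORT)) as s:
        s.sendall(struct.pack(">I", len(data)) + data)

def recv_text(conn):
    # server side: read one length-prefixed message off an accepted connection
    header = b""
    while len(header) < 4:
        header += conn.recv(4 - len(header))
    (length,) = struct.unpack(">I", header)
    buf = b""
    while len(buf) < length:
        buf += conn.recv(length - len(buf))
    return buf.decode("utf-8")

Length-prefixing avoids the "where does one message end?" problem that zmq normally handles for you.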
>>28912 My man almost set up his robowaifu network on the poor man's Minecraft 'server' app
>>28912 Looking forward to seeing what you come up with for this, Mechnomancer. >Also documenting the entire process if not for others, at least for my own sanity when I inevitably touch some system file I shouldn't have and bjork the pi's sd card Very good thinking, Anon! Cheers. :^)
>>28913 Only for when I take SPUD on the road :)
Open file (1.96 MB 779x1341 spudlegs.png)
Open file (1.16 MB 548x1255 spudlegdetails.png)
Woke up early this morning, couldn't get back to sleep, so I wired up SPUD's legs. I figure I might be able to get away with syncing the hip rotation servos to the ankle rotation servos (at least for now). This would effectively reduce the # of servo channels required down to 3: 1 channel to lean from side to side, 1 to move the left leg forward/back, and the other to move the right leg forward and back, very similar to the femisapien. An extra wire is required for the ASMC servo power; however, they don't need a ground as they are apparently grounded on the servo signal wire. The legs stand even when unpowered; however, there is a slight tendency to fall backwards, so I might end up making the feet even bigger, and either have a pair of false feet sitting on some big platforms, or just some megaman-esque feet/shins. There's certainly gonna be some shenanigans when I try to get it to walk for the first time; I'll probably hold it up with a rope or something like they do for all them early versions of walking robots you see on youtube and stuff.
>>29134 There are numerous studies on human locomotion & kinematics that show that average human walking gaits are in fact an accumulation of hundreds of subtle & not-so-subtle muscle movements, plus a coordinated inverted-pendulum 'mass throw' (as in 'controlled-falling'). It takes us years to mature to the stage where our musculo-skeletal/nervous systems master this orchestration with aplomb. That we can even seriously begin to pin everything down to the degree that we can honestly begin anticipating accomplishing this by robo-DIY'rs today is nothing short of amazing IMO! :^) Good luck, Mechnomancer! >=== -sp, minor edit
Edited last time by Chobitsu on 02/05/2024 (Mon) 16:40:53.
Open file (5.83 MB 480x854 Leg Dance.mp4)
>>29135 One smol shuffle for robot, one giant leap for robowaifu-kind. The ankles are on the wrong way and there's a lot of joint slop, but it's progress. Especially since they're all being controlled on 1 servo signal lol.
>>29134 >I might end up making the feet even bigger permanent skis here we GO
>>29134 Very nice, Spud's on her way! :^) I would recommend you take about a 10Kg barbell weight affixed over her centerline, mounted atop an actuator-controlled vertical lever (thus your inverted 'pendulum'), and with that you can begin to coordinate the needed counter-balances. This will enable you to have her lift a foot completely up off the ground as she's sliding it forward. (If you want to get an innate sense of what's needed for the proper dynamics here, simply take a broom and balance it upside-down -- handle tip resting on your finger tips -- then walk forward like that.) Loosely suspend her first with overhead cords as you suggested! :^) >=== -minor edit
Edited last time by Chobitsu on 02/05/2024 (Mon) 16:54:46.
Open file (121.49 KB 502x869 Femisapien-a10.JPG)
>>29140 Scale up the femisapien to 5'5" and the feet are like 12" long (or something, it's been a while since I did the maths). I could always just move the ankle axis further forward instead of making 'em longer. We'll see. Not like my workshop floor is exactly level lmao.
Open file (3.96 MB 320x570 Body tilt.mp4)
>>29143 Already got torso tilting. I have a GY-521 gyroscope installed, too. Just need to deploy my balancing code.
>>29145 Yeah, very nice. It's obviously desirable to use the actual dynamical masses of the real system for prototyping, rather than a dummy mass. Also, don't neglect the mass of the head as its own 'inverted pendulum' for the upper torso. Not sure what the mass there is, but a real human head/neck is at least 1 stone on average. You can consider its dynamics kinda like the trim-tabs on an aircraft, or the tip of the tail of a cheetah racing across the savannah. Nice progress, Anon! >=== -prose edit
Edited last time by Chobitsu on 02/05/2024 (Mon) 18:27:48.
>>29145 Time to walk like I shat myself. I read that humans walk by throwing themselves off balance at the start of each step; that's why robots have that 'I shat myself' walk: they're doing the exact opposite and keeping balance for the entire process.
>>29134 >I might end up making the feet even bigger Well, I hope you changed your mind >>29145 >>29147 Yeah, that's how I understand it as well. We need to make robots that can fall without breaking, then we won't need to care. I realize more and more that many if not most plastic parts should be made out of TPU: >>28944
>>29147 Well I'm using the femisapien's walking mechanism as reference and its walk cycle doesnt seem so... shitty. lol https://www.youtube.com/watch?v=rNezxoSEEB0
>>29145 Very nice! Where did you install the GY-521? Do you use PID for balancing?
>>29136 Spud is gonna have moving legs too? This is ambitious and very interesting! So cool to see someone else actually building a life-size robowaifu! Hopefully she can be set free one day! (Like M3GAN in that funny movie).
Open file (136.28 KB 291x201 gyro mount.png)
Open file (272.57 KB 467x436 broken ankle.png)
>>29151 >PID control Honestly, I never did find calculus particularly useful (although some of the supporting hardware/software I'm using most likely uses it behind the scenes). I'll certainly be using the concept of PID... but without the goofy maths. The gyroscope is currently just hanging out of Spud's neck (you can see it jiggling in the vid), but I'll probably mount it in the trapezius or something. Maybe use the accelerometer in it to make a shoving/balancing video with Spud going "hey" and "stop it" >>29152 I gotta aim for the hardest thing. That's my mojo. If the legs are a success then I'll scale it up for power armor and maybe even a bipedal riding mount (a la Iron Monger); however, that isn't particularly robowaifu so I'll leave it at that :D If 2 legs don't work I'll make Spud a robo-spider throne to ride. During some tests I forgot to keep the main 20v powerline separate from the servo signal line. It picked up the noise from the servos' current draw when they start/change positions. It gets a messy signal, making the motors spazz more, which makes more noise, creating a feedback loop of messiness that made me glad I had an emergency cutoff. However, during the adventure the ankle joint over-extended, breaking it. Servo noise seems to be my kryptonite, whether in mech projects or robowaifus.
>>29145 What do you think about adding some rubber strings between the hip and the torso? These would help pull the body back towards the center.
>>29165 Depends how much slop is in the abdominal joints. They seem to be doing ok, since in the vid I'm using a manual servo testing board to move the chunky 180kg servos. I got a pack of little bungee cords I was using before I added the ab joint. Probably need to 3d print a knee linkage to replace the pvc one due to joint slop...
Open file (129.70 KB 1500x1330 71G-u-8xiHL._AC_SL1500_.jpg)
I succumbed to temptation and got one of these. Not only are they neat, but it could be a good temporary face mask while I work out the more realistic face. And allow for more (literal) flexibility for eye placement. It's expensive tho. Thank goodness for overtime pay!
>>29185 > spends WAY too much, in the autistic pursuit of basic progress... Heh, a man after my own heart. :D
>>29193 Well, I've heard about these screens for years and I've always wanted one, so this gives me an excuse :D Did a thigh test and isolated the 20v power cable from the servo signal lines. Had no problems with signal noise, just that dang joint slop in the left knee. Definitely gotta replace that with a 3d printed joint. The servos might actually be fast enuff that I could do a walk cycle that doesn't balance the robot on 1 foot all the time (like in previous posts here, walking being a partial-falling sorta thing). If I put some padding between Spud's frame and the morphsuit it could help her survive falls... and make her huggable I guess.
Open file (107.12 KB 474x474 ClipboardImage.png)
Open file (179.11 KB 474x474 ClipboardImage.png)
Open file (195.48 KB 474x474 ClipboardImage.png)
>>29206 Consider designing it around some kind of door hinge. Some might need constraining from the side, but this could also help to make the leg easier to remove if necessary. I would really keep an eye on standard metal parts. If necessary, the printed parts holding these can be adjusted later to make it work for other people, especially if these parts are just small and simple themselves. Not everything has to be 3D printed just because it can be.
Open file (861.92 KB 591x559 roboelbows.png)
>>29207 Will probably do something similar to what I did with the elbow (but not double-jointed): held together with a machine screw and nylon lock nut. Easy to replace: I swapped out one of the parts and it took like 5 minutes tops.
>>29208 Ah, okay. This might be the better way. And in the knee it would even be less of a problem if it's a bit on the bulky side. Elbows should really not be too big.
>>29185 Very cool piece of tech! How much did it cost? How are you going to connect it?
Open file (226.32 KB 1500x1329 81qb27AydwL._AC_SL1500_.jpg)
>>29212 Cost 350 bucks O_O so it's my only purchase until next payday. It comes with some boards to connect it to HDMI.
Open file (5.56 MB 2109x1389 kneu knee.png)
New knee, who dis?
>>29215 How do you want to use it? Are you going to put it into SPUD's face, or do you want to replace the face with this flexible screen? I find the idea of debugging her through her face very interesting, assuming you are going to use the hdmi output of the raspberry pi.
Open file (1.21 MB 300x189 bth.gif)
Open file (9.31 MB 1364x768 faces.mp4)
Put together a dynamic emotion library. How it works: first the script creates a list of all files in the emotion directory (each .png file has a filename describing its emotional state). Then the script scans the input text (which will eventually come from the AI but for now is manual input) for all possible emotion words, and displays the file named with the last word detected - and, the important bit, it only looks for words that already have files associated with them. So adding a new expression is as easy as adding a .png file to the proper directory, named for the desired emotion. In the future this system could also be integrated for animations (see first post), where Spud will even physically move according to the words the AI says (so if the LLM says it is dancing, the actual robot will dance a pre-programmed dance).
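A minimal sketch of the keyword-scanning logic described above, with a hypothetical "emotions" directory and opencv used only to show the chosen file; this is an illustration, not the actual SPUD script.

import os
import cv2

EMOTE_DIR = "emotions"   # hypothetical folder of happy.png, sad.png, angry.png, ...

# the list of usable emotions comes straight from the filenames
emotes = {os.path.splitext(f)[0] for f in os.listdir(EMOTE_DIR) if f.endswith(".png")}

def pick_emotion(text, current="neutral"):
    # keep the last word in the text that has a matching .png; otherwise keep the old face
    for word in text.lower().split():
        if word in emotes:
            current = word
    return current

emotion = pick_emotion("oh wow that makes me happy")          # -> "happy"
face = cv2.imread(os.path.join(EMOTE_DIR, emotion + ".png"))
cv2.imshow("face", face)
cv2.waitKey(1)

Dropping a new .png into the folder is all it takes for a new expression to become detectable.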
>>29243 Nice. Could do some kind of interpolation to mix those expressions - not pixel interpolation though, guessing that would just give weird monstrosities.
>>29243 Thanks, that's a good idea and I love how such a small thing can already be very intriguing. I contemplated something like this a few times after my more ambitious idea of extracting this from an AI generated video failed >>26098. Though, I thought more about doing this for each syllable, to make speech and singing animations. I just thought it was too difficult and time consuming for me, especially since I have no experience in drawing. But something like this might be particularly interesting for an AI girlfriend app. Then again, I still have a hard time believing that this doesn't already exist somewhere. I generally don't understand why animation isn't more automated using standard patterns which are then maybe adjusted to some specific character.
>>29245 >animation isn't more automated It is. A lot of modern cartoons are made similar to old cel-style 'sections', where there is a body on one layer, and then any part that will move is on another layer. Bodies are broken down into a full front shot, profile, 3/5ths, arms, legs, head, and even the face, all completely separate, and all these pieces are placed in a library or database to be used to create specific animations, while things like walk cycles, sitting, standing, can be reused in whole chunks like 3d modelling rigged animations can be transferred between models. Some digital animators can't actually draw at all and will never have to, because they place cartoons together like it's done in South Park.
>>29243 Cute!!
>>29261 I always thought so, but this seems to be expensive software or company-internal workflows. If it were available as open source software or even just in some common animation program, then imo we would see many more animated videos.
>>29263 We see a lot less of this the more we head into completely digital works. As cool and fast as digital art is, cartooning is actually losing some techniques. >>29258 >Much more animated videos You do, it's just that so many of them are terrible. Youtube animation has leapt from dedicated amateurs to kids as young as 8 with their own channels full of terribly drawn yet very smooth animations. Toonboom and a couple of other programs released free versions of their software a few years back, and now a lot of studio-quality animation floats around on youtube.
>>29263 My point is digital motion tweening a la Flash looks cheap af. Frame-by-frame is more visually appealing, especially with smears and whatnot. However, sometimes digital tweening works for certain elements, like the shading in "Klaus". It would be more complicated to implement digital tweening using opencv rather than simply create some in-between frames by hand. But if one wants a cartoony effect, anime characters do tend to change facial expressions quickly.
>>29243 Is this written in python? So at the moment you use one png per emotion, those faces in the video look like they are build of parts (eyes + nose + mouth etc). where did you get these faces from? do you have a link? I would play around a little bit, try to construct them from parts.
>>29266 Yup, python-based. Some nerds complain how it is slow and xyz, and I pissed 'em off by saying "consumers don't care about backend fiddly stuff they just want product". OpenCV is a little weird when it comes to combining images, which is something I have to look into, because it would be nice to have a "layer" for eyebrows, eyes, cheeks, mouth, etc. Found a higher-res version of the faces lel. I've been fiddling with my local AI; however, it needs some coaxing to get it to emote anything so it can do facial expressions. Half-tempted to also check what it says to look for emotional keywords too. Or I could spend $300 on another graphics card and use a higher-tier model.
>>29268 In my opinion it's way faster to get shit done in python. And that's what matters most in a small team project. What I really like about python is the low entry barrier. Just download and start scripting. So you are using opencv for drawing .pngs? Thanks for the faces!
>>29268 Rendering layers is pretty trivial; look into OpenGL instead, or the compositor of your video driver if you want to be autistic.
Open file (18.54 KB 363x87 solution.png)
I decided to brute-force my way to a solution, going through every single opencv function that seemed relevant. With all the folks trying to make HUDs for webcams a la Ironman, I'm surprised I had to actually trawl through documentation rather than some nerd going "hey look!". Using cv2.bitwise_and(image1,image2) to create a compiled image seems to work: it merges the two images pixel-by-pixel (a bitwise AND rather than a straight addition), and with the right masks the layers combine cleanly. Neat! For some reason everyone tries to use "add weighted" or variations thereof. However, interpreting emotional data in a multidimensional way (eyebrows, eye shape, mouth shape, face icons like blush, sweatdrop or forehead veins) will require some more thought and experimentation. This will be especially difficult using a less sophisticated AI model.
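For reference, a sketch of the usual mask-based compositing pattern those bitwise functions get used for in opencv. The filenames are hypothetical and all images are assumed to be the same size; this is an illustration, not the SPUD code.

import cv2

base  = cv2.imread("face_base.png")          # hypothetical layer files, all the same size
layer = cv2.imread("mouth_smile.png")
mask  = cv2.imread("mouth_smile_alpha.png", cv2.IMREAD_GRAYSCALE)

_, mask = cv2.threshold(mask, 10, 255, cv2.THRESH_BINARY)
mask_inv = cv2.bitwise_not(mask)

bg = cv2.bitwise_and(base, base, mask=mask_inv)   # blank out the region the layer will cover
fg = cv2.bitwise_and(layer, layer, mask=mask)     # keep only the layer's own pixels
composite = cv2.add(bg, fg)                       # the two never overlap, so nothing blows out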
>>29263 >Toonboom Thanks, noted. >>29272 Thanks, this could be very useful. For robots with screen faces and virtual waifus as well.
>>29272 Nice work. I'm glad that you're focusing somewhat on performance in your search for 'the right tool for the job' (another honorable & necessary goal). OpenCV's engine is effectively written in pure C++ (there may be a very smol amount of some inline ASM code for specific drivers, etc.) The Python serves as a frontend API to this backend system. This is quite similar to the architectures used with TensorFlow and PyTorch, for example (and the approach we also plan for the users of robowaifus). The >tl;dr is: use the scripting interface for most jobs, since that is good enough. OTOH, if you ever need to 'open the hood' and tweak the system's basic characteristics and/or performance itself (and you can't find already-scripted means you can assemble together into precisely what you need), then you'll need to drop down into the underlying systems languages to accomplish that (C & C++). Thankfully, because all of these engines are opensource systems (and permissive ones also [Apache, BSD]), you can do exactly that -- and just as you see fit! :^) (In /robowaifu/ 's case, we haven't even finished building the waifu engines themselves yet; so the Pythonic APIs will come later on, as well.) >tl;dr Kinda like if you wanted to automate using Blender, then you'd use its Python API for that. If, on the other hand, you wanted to change Blender itself, you'd use C++ or C to do that. --- >However, interpreting emotional data in a multidimensional way (eyebrows, eye shape, mouth shape, face icons like blush sweatdrop or forehead veins) will require some more thought and experimentation. Abstraction. Write a little wrapper library for yourself that has all the high-level elements you need (happy, sad, etc.)... and ways to properly blend them together. Use that abstraction layer in the code you interact with the AI through. That way, regardless of how either system changes -- on either side of the 'barrier' -- you've got one go-to place to deal with that underlying complexity properly. --- Good luck Mechnomancer (and everyone else working towards this particular set of goals ITT). Looking forward to seeing all your results! Cheers. :^) >=== -fmt, prose edit
Edited last time by Chobitsu on 02/10/2024 (Sat) 12:15:34.
Open file (61.46 KB 680x393 199324749.jpeg)
>>29272 >a la ironman
Open file (4.37 MB 404x720 pepper'sghoose.mp4)
>>29276 I ended up using the "Pepper's ghost" optical illusion back then. I could revisit it now, but that is not really relevant to making a robowaifu, unless you wanna make a virtual waifu living in a "holotank". It's pretty much just reflecting an LCD onto a piece of plastic at an angle.
>>29279 >unless you wanna make a virtual waifu living in a "holotank" We actually have an entire thread for precisely this topic, our Visual Waifus thread : >>240).
Open file (6.00 MB 480x586 bendyscreen.mp4)
screen arrived
Open file (389.96 KB 1023x555 tex.9.png)
>>29272 Don't judge me too hard for my suspiciously in-depth knowledge on this, but one of the best places to find animation/character sheets (like picrel) is from VNs. Specifically, Emote-engine stuff, games by Yuzusoft or SMEE, anything advertising "Live2D" or the like. If memory serves, the best-quality sheets I've found were from
- Nekopara
- Meteor World Actor
- Maitetsu
- Nobleworks (partial - fully assembled faces like you're already using)
GARBro can extract most VNs these days, and even from traditional (non-animated) VNs, you can decompose faces with add/sub/and/neg/xor operations in GIMP or the like. Not sure if links to the high seas are allowed, but you can probably grab any VN you forgot your license code (happens to the best of us, amiright?) to from ryuugames or whatnot (just use a tampermonkey/greasemonkey link bypass script so you don't have to fill out some retarded survey)
>>29279 I have to say that I am really overall impressed with your robotics and applied engineering, just really really good work there.
Open file (1.11 MB 1973x1305 better solution.png)
>>29289 Hmmm, I'll have to take a look; vtuber assets might work, too. I found a better solution that will allow for overlaying png files using their alpha channel. It is a bit clunky, as I extract/convert the alpha channel to a separate alpha map, then load the image again for the diffuse, but it works. And it can be iterated on so multiple pngs can be overlaid; however, so far the addition of each layer needs to be a file, not an image modified by the script outside of the function. Getting iris movement would probably require 2 masks: one for the eyelash layer (so it can be on a custom skintone face), and then another to crop the iris layer.
>>29295 Does your troll picture have a proper alpha channel? If so it should be accessible as the fourth value in the list (rgba). OpenCV seems to load it if you use the IMREAD_UNCHANGED param. You could separate the image loading from the overlay part and just iterate the overlay part over a list of images, so that you add each layer with each iteration. This way you can do something like:
paths = [path1, path2, path3]
result = load(paths[0])                   # start with the bottom layer
for p in paths[1:]:
    result = overlay(load(p), result)     # stack each remaining layer on top
Some pseudo python, I can't code shit without an IDE
Open file (1.21 MB 924x520 facetest.mp4)
>>29297 I just ended up repeating the code as many times as I needed for each layer instead of using a function or a loop, because I had to make some slight tweaks in each iteration. Created a basic engine where after a bit the face blinks, and I open the mouth with the spacebar. A few more buttons switch between the 2 current emotions; the naming scheme you can see to the right (as well as up to 12 different skin colors in case you want to switch between a tan waifu, pale waifu or custom colors/textures). Should probably stick to fairly lo-res images because of all the maths. IDK, I don't feel like doing performance tests. I can use elements from my previous dynamic emotion library to detect the available emotions based on whether the png files exist in the directory rather than coding it in manually, so adding a new emotional expression is as easy as making the graphics and naming them properly. Now I just need to connect it to oobabooga and get oobabooga to act sane through the API.
>>29288 Remarkable. It will be very interesting seeing your design for installation of it. Good luck, Anon. --- Very nice progress with the emotions research! >=== -add'l resp -minor edit
Edited last time by Chobitsu on 02/12/2024 (Mon) 06:08:13.
Open file (5.44 MB 668x480 bendy owo.mp4)
>>29366 Here it is. There are some performance issues because the screen is 1920x1440 and I'm using an older raspi setup (32-bit), but some tweaks/threading can smooth it out. Also the left side of the screen is getting pinched D:
>>29403 Very nice! >screen is getting pinched It seems to me you'll need a slight redesign, and extend the frame bevel all the way across (ie, down, in this orientation).
>>29404 The top and bottom frame are connected by an independent piece (held on by the screws you see in the vid), so I just printed some new pieces 3mm longer and no more pinch. >I am pinching an lcd screen The future is weird, man.
>>29403 Are you loading your images on startup or before you display them? Maybe caching could reduce the performance impact.
>>29419 I was loading them during every iteration of the loop. However, the performance issue lies in the adding of the 5 different images, not the loading process. A 200x200 pixel face results in something like half a million pixel calculations every iteration. Not a big deal for desktop computers, but there is a considerable performance drop on the Pi. It also wouldn't be that big a deal except the flexible oled is only capable of a resolution of 1920x1440 (higher res than my 30" monitor), and even when scaling up the 200p image after all the calculations there is still a performance drop. So I did end up implementing a sort of cache :)
To get a bit into how it works: the face status is assigned with a function showface() with variables eyes, nose, mouth, base, skintone. These are assigned with variables that redirect to the image filepath by adding strings together elsewhere, outside the showface function. So changing the variable emotion from "smile" to "pout" changes an entire set (assigning a non-available emotion will result in the face giving you a death stare until you fix it, plus an error printed on the console. I can upgrade it to an audio-visual cue/better error handling at some point). Example: mouth = eld + emotion + "_mopen.png" where eld is "hires_eye_expressions_crop/" (the directory relative to the script) and emotion is the base of the filename. This results in a change from "hires_eye_expressions_crop/smile_mopen.png" to "hires_eye_expressions_crop/pout_mopen.png"
When a new face is calculated the image is stored in a "faceprev" variable and the filenames are stored in a "current status" variable. This current status is then compared to the assigned statuses in the next iteration of the loop, and if they are the same it loads the image from faceprev instead of calculating a new one. If they are different, it calculates a new one (and a new faceprev). Perhaps a little performance lag during the change, but some threading can keep it a little separate from other functions like object detection and motor movement.
I also had to divide the eyebrow/nose layer by 255 because it was getting blown out for some reason; artifacts around the eyes on the darker skintone are something in the png, not the script ¯\_(ツ)_/¯
Sorry for wall o text, I like talking about it ᕙ(▀̿̿Ĺ̯̿̿▀̿ ̿) ᕗ
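A minimal sketch of that compare-and-reuse cache, with a hypothetical compose_face() standing in for the actual layer math; not the real SPUD function.

prev_key = None
prev_face = None

def get_face(eyes, nose, mouth, base, skintone):
    # only recompute the composite when one of the layer filenames actually changed
    global prev_key, prev_face
    key = (eyes, nose, mouth, base, skintone)
    if key != prev_key:
        prev_face = compose_face(*key)   # compose_face() = hypothetical expensive layer stacking
        prev_key = key
    return prev_face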
>>29420 Welcome to software rendering lol. Check if the library is actually using SIMD instructions, otherwise yeah it's going to be like 16x slower. Don't know what cpu is on a pi - I think it's an ARM Cortex, which has SIMD; maybe you need to set some parameter to make it vectorise.
>>29420 >the flexible oled is only capable of a resolution of 1920x1440 Is there really no way to lower the resolution you're outputting to it? I don't think I've ever encountered a monitor that won't display at lower resolutions.
>>29424 It's actually very common for Chinese screens to either have a terrible scaler or to lack one altogether. Doubly so for low-quantity bespoke solutions. As for this project, glad to see Mechnomancer integrating well within this board and creating an interesting facial architecture. One which should leverage the graphics processing power of his Pi more.
>>29420 This looks really good, Mechnomancer. I have to admit (against my better judgement) that it would be compelling for normals if you fashioned a sort of faux A*ple Vision Pro-like border rim around your screen. Kind of like a full-face mask. The curve actually does quite well at luring you into a perceptive 'suspension of disbelief', and I think the mask motif might quickly send most non-autists into an alternate robowaifu reality if Spud was geared up so. >>29421 This. IIRC, the Pi systems rely on a non-standard Broadcom phone chipset for graphics. I'm unaware if any GPU-specific APIs have been reversed for it yet, and even if so, whether the OpenCV library would have driver support for it. Regardless, vectorisation would help tremendously. His older chipset might not support it, but I would imagine the newer (RPi4 - up) might. Great advice, Anon. Thanks. >>29427 >As for this project, glad to see Mechnomancer integrating well within this board and creating an interesting facial architecture. Yes he's doing a great job here.
>>29436 If it's not supported you could always use a lower colour depth; 32bit is overkill when there's only a few colours used. You could use a bmp with indexed colours instead of png, then you only have like 4 bits per pixel, much faster to process. I don't know how opencv works, but there has to be some kind of colour-to-alpha function to set a specific colour as a transparency, like the magenta you see in oldschool bitmaps.
Open file (874.82 KB 960x540 henlotheretest.mp4)
>>29424 It theoretically can do 640x480 according to the pi, but the screen refuses to do that resolution. >>29436 It's a pi 4 with a 32-bit OS (4gb). Definitely gonna make the screen holder more pretty; I just needed something quick because I don't wanna risk breaking it. That single screen alone cost more than all the heavy-duty lift motors in my crab mech! An advantage of a full-screen face is that when it is off the robot doesn't look possessed, unlike having a screen behind a faceplate without physical eyelids. >>29449 In order to do the overlaying process I actually convert the alpha in the png to an alpha map and use that to add/subtract parts of the images from each other. If I didn't extract alphas from the pngs I'd double the # of required files. I managed to put together a software-based mouthflap protocol. While the audio is playing I open the raw wav file as a list of frames, record the length of each iteration of the loop, and check the amplitude relative to the loop length (if the loop took 50ms, it jumps ahead 50x12 = 600 frames, as there are 12 wav frames per ms; so if I was on the 1200th frame it would then check the 1800th frame. Just 1.26 seconds has over 15,000 frames!). I'm checking individual frames, a very small fraction of a sound (1/12th of 1ms!), so it's kinda amazing it even relatively synchs up, and since each iteration length is slightly different, I get different lip synch results each time. Sometimes the mouth hangs open (not in the video) and it is adorable. I could minimize this by doing averages of groups of frames but I'm done coding for the day.
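A rough sketch of that amplitude-peeking mouthflap idea, assuming a 16-bit mono wav, a hypothetical filename, and an arbitrary threshold. The actual audio playback is assumed to happen in another thread or process; this loop just follows the wall clock to decide which frame "now" falls on.

import struct
import time
import wave

wav = wave.open("reply.wav", "rb")                 # hypothetical TTS output, 16-bit mono assumed
raw = wav.readframes(wav.getnframes())
frames_per_sec = wav.getframerate()

start = time.time()
while True:
    pos = int((time.time() - start) * frames_per_sec)   # frame the current moment falls on
    if pos >= wav.getnframes():
        break
    sample = struct.unpack_from("<h", raw, pos * wav.getsampwidth())[0]
    mouth_open = abs(sample) > 2000                # arbitrary threshold, tune by ear
    # swap the mouth graphic here based on mouth_open
    time.sleep(0.03)

Averaging a small window of frames around pos would smooth out the "mouth randomly hangs open" behaviour mentioned above.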
Just a quick update: loaded the face script onto 64-bit Raspbian, seems to be running twice as fast lmao.
Another smol update: managed to piece together a script to put all the raw face image data into a variable (aka a buffer) in the script rather than loading from disk for every iteration. I don't know if there will be any performance increase yet, as I haven't yet deployed it on the pi. Also got the script installed that creates a list of possible emotes based on the emotion graphic files, then scrapes a string to see if the emote exists and sets the expression accordingly. Is it clunky? Probably. Does it work? Yes.
>>29500 Probably. Read speeds go io, ram, cache, registers, ordered from slowest to fastest; could be worse if python uses some weird alignment and you end up with cache thrashing. Doesn't your pi have a 64-bit cpu? Check with lscpu - the pi 4 specs say it does. Don't know why you would use a 32-bit os then.
>>29501 Well my "cache" is just a python list containing the picture data in the form of arrays. Between my wonky cache and better upscaling somehow magically appearing, performance on the pi has drastically improved. See video. The loading screen isn't just for looks: it more or less reflects actual loading progress after opencv is imported. Even progress of the raw picture data being found/loaded into the list/cache and the emotion list being generated is represented in the loading bar. I have the pi communicating with a server script on my main computer that handles voice generation and will eventually communicate with oobabooga (other AI options are available), and of course the program takes the voice recognition data, looks only for emotion keywords that exist in the picture library, and loads the set accordingly.
>>29504 Yeah, you're loading the pics to ram, which is faster than reading from the disk. I say cache when I mean the cpu cache (ram is slow for the cpu, so when it reads from ram it pulls an entire chunk into the cache, because statistically you're going to ask for something nearby). You could use a profiler to see how many cache misses you're getting, but it's just autism at this point. It looks fine, good work.
Hey Mechnomancer, do you have a particular brand of servos that you'd recommend? I'm looking to pick up 5 or 10 of 'em for fixture prototyping over the next few months. Non-continuous (up to 180deg on some of 'em), low torque (couple N*m at most), decent-ish speed (90deg/s?). btw, OpenGL ES is pretty well supported on the broadcom mystery-meat graphics chips; the free version of Qt uses it and has a pretty straightforward canvas API too, if memory serves. Should be able to keep all the textures loaded and just swap out layers at runtime (50 bytes of VBO instead of an entire texture), so no delays and a sync'd framerate even during animation. >>29510 valgrind's cachegrind (and callgrind, to a lesser extent) would get you the benchmarks this anon mentioned, but I'd go for the hardware acceleration (GL ES) approach first
>>29522 I just get whatever servos I can. I tend to use the MG995 or the ETMall MG996R for low-torque applications (6x for $20), while GoolRC has 70kg ones and Injora has 35kg ones. All of these are the same size/form factor, but Injora comes with brand stickers you can put on your project. I get 'em off spamazon. I'm just aiming for the minimum viable waifu, as easy as possible. If y'all wanna do more fancy stuff like OpenGL be my guest, but that ain't my jam, yknow? :) Got 2 of the modular systems done: adding more voice commands (and the scripts that run as a result) and emotions is as easy as making the files and naming them properly. A few extra features like a dedicated simon says protocol, a placeholder for when it will connect to the LLM, and seeing what the bot sees. The weather forecast API appears to be down at the moment. Speech recognition has to be a separate program from the face display, as it doesn't like to play nice with threading. Possibly some alternatives exist, but I kinda like having 2 programs, as I can adjust one without having to restart the other. They communicate by putting data in a text file. Might be a little slow, but there is lag from the TTS engine anyway so another fraction of a second doesn't matter much. Making this modular has required so many loops it has driven me loopy ᕕ( ͡ʘڡ ͡ʘ)ᕗ
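A minimal sketch of that text-file handoff between the two programs; the filename is hypothetical, and a real setup would probably want atomic writes (write to a temp file, then rename) so the reader never sees a half-written line.

# writer side (speech recognition process); the filename is hypothetical
def push_command(text, path="spud_command.txt"):
    with open(path, "w") as f:
        f.write(text)

# reader side (face display process), polled inside its main loop
last_seen = ""

def poll_command(path="spud_command.txt"):
    global last_seen
    try:
        with open(path) as f:
            text = f.read().strip()
    except FileNotFoundError:
        return None
    if text and text != last_seen:     # only react to new content
        last_seen = text
        return text
    return None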
>>29449 >when theres only a few colours used you could use a bmp with indexed colours Very good advice. >>29451 >>29466 >>29500 >>29504 >>29602 Wow! That's a whirlwind of updates and a rather nice outcome to it all. GG, Mechnomancer! It's gonna be fun watching Spud come together. Cheers. :^) >>29522 >btw, OpenGL ES is pretty well supported on the broadcom mystery-meat graphics chips, the free version of QT uses it and has a pretty straightforward canvas API too, if memory serves. Excellent. That's really good to know, Anon. Thanks! :^)
>>29504 When I tried something similar, I started contemplating using GIFs for each sequence I need, hoping this would cut down on the memory costs and loading time. The idea was that the file format would have a compression algorithm, so it would only update what's necessary in each moment.
Open file (523.49 KB 3120x4208 SPUD_screenface.jpg)
>>29623 Not sure if you're aware, but just in case (and for the other anons out there): from my experience, gif compression works by reducing colors/dithering, and optimization utilizes transparent pixels to only display the difference between frames. Anywho, did a quick test fit of the screen face, and just as I feared: I need to redesign the head because the screen juts out too far.
>>29666 I might have briefly read about this, but didn't remember. Thanks. The face looks nice, and a slightly bigger head will look better anyways.
Open file (1.06 MB 582x1446 spudsuit.png)
Waiting for the new head pieces to print, so I dressed SPUD in the morphsuit for the lulz. I definitely need to upgrade from the paper panels. Maybe even 2 morphsuits with padding sammiched in the middle would not only improve aesthetics but also protect against damage in case of a fall.
>>29672 Wow, this is maybe the best looking irl gynoid I have seen so far.
>>29672 the karen screen face is more my thing
Open file (35.81 KB 731x564 filez.png)
>>29673 Technically not a gynoid, as it doesn't have -nor do I plan for- any gynecological bits. After all, this particular unit is initially meant to be an artificial booth babe. :P When I do release SPUD I don't care what mods people wish to add. What a man does with his robot is none of my concern. >>29676 Well, I am thinking about customization in future: you can do your own custom face, whether owo, realistic, iconographic or karen, as easy as making/editing the files :)
>>29677 >Technically not a gynoid, as it doesn't have -nor do I plan for- any gynecological bits. Yeah, this could be a point, but to me it's just that she's meant to be female. Most androids also don't have the male parts for reproduction.
Open file (9.08 MB 3201x1389 ouchie.png)
>>29681 I just figure gynoid should refer to the type of female android that has the fun bits ;) New head is complete, but SPUD got an owie in the process :'(
>>29683 Wait, is the screen broken?
>>29685 It's got a cluster of dead pixels :( Mental note: get a screen protector
>>29686 Damn, and that was a new screen too.
>>29683 >>29686 Great progress actually, Mechnomancer. Please don't be too bummed about the boo-boo on Spud's face. You can kind of think of it like EtR Lulu -- it gives her a charming dignity of having overcome obstacles! Just like you do, my friend! Keep. Moving. Forward. Cheers, Anon. :^) >=== -minor edit
Edited last time by Chobitsu on 02/20/2024 (Tue) 09:01:29.
>>29683 Can't you still return it on spamazon? I would totally just send it back, tell 'em it was broken when you started it up.
Open file (1.26 MB 740x1062 beveragebottom.png)
Open file (922.27 KB 320x570 Tippity Tappity Test.mp4)
I found a beverage dispenser with a bottom that's almost exactly the same circumference as the current screen holder. So I carved a piece out of it and slapped the screen behind it. It even has a lip on one side to help hold the screen in! Talk about providence, eh? Now to 3d print a better chin.
>>29724 Neat! Very inventive appropriation Mechnomancer. Today I'm constantly on the lookout for such things, since a major goal here is to allow impoverished men to also build their own robowaifus. "One man's trash is another man's treasure", they tell me. Cheers. :^)
Open file (220.90 KB 307x467 chin.png)
>>29744 No sense reinventing the wheel :) I think the wig looks awful, so I've ordered a different one: a single wig instead of 2 combined. Also installed the new chin, and will start work on re-printing some of the parts so they make a better shape under the morphsuit.
>>29752 The wig looks quite nice. But I might be a weirdo.
>>29752 And you'll be able to dress her up as a pirate for Halloween!
Open file (410.15 KB 559x465 headz.png)
>>29753 >>29753 It might be my a̶u̶t̶i̶s̶m̶ artistic eye for perfectionism kicking in, but the top wig (crown) is actually rather small (even the reviews said so but I didn't realize how smol it really was), so it has the appearance of the hair being plastered to the head rather than big and voluminous (see the red scribbles I did in the pic). I think it has to do with the screen tbh. Even so, the new wig will be arriving in a few weeks (it's one of a relatively esoteric character so no amazon prime turbo shipping). In the meantime I'll be trying out one of the other face "skins" I made that has a lower face silhouette on it. Just got done putting together a jukebox script, too. Need to put files in a specific folder and have the artist name in the mp3 metadata, but the script can more or less find songs by saying the artist name, play all the songs by the specified artist in a random order without repeating, and skip tracks when you say "next". I could do other metadata too, like song titles and albums, but I'll stick with the basics for now. I also found a nifty script that allows for a confidence interval of similar words, so if the speech recognition AI model thinks you say "whether forecast" instead of "weather forecast" I can add in that bit of fudge factor to make things easier.
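One standard-library way to get that fuzzy "confidence interval" matching is difflib.get_close_matches; this is just an illustration of the idea with a made-up command list, not necessarily the script he found.

import difflib

COMMANDS = ["weather forecast", "play music", "next", "simon says"]   # example command list

def match_command(heard, cutoff=0.7):
    # return the closest known command, or None if nothing is similar enough
    hits = difflib.get_close_matches(heard.lower(), COMMANDS, n=1, cutoff=cutoff)
    return hits[0] if hits else None

match_command("whether forecast")   # -> "weather forecast"

Lowering the cutoff makes the matcher more forgiving of speech-recognition slip-ups, at the cost of more false positives.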
>>29760 Since you've already ordered another wig, it might be a good idea to try styling this one as an experiment. Basic stuff, like curling, cutting, the like. That way you'll be better equipped to fix any issues that might happen with the wig you ordered.
Open file (234.82 KB 487x240 faces.png)
>>29761 I moved the sides further up and tried out a different face skin and compared the two. Only problem is now the hair isn't secured at all lmao
Open file (81.41 KB 541x454 1692831980580923.jpg)
>>29786 The right side looks super-cool now, Mechnomancer. Kinda like she's wearing a space/mech helmet! :) >yfw punished Spud
Thinking about it, I might simplify SPUD's face assets even further, like the attached, until I can perfect the physical face. Also, thanks to this screen face revision I should be able to swap between face types with relative ease. I think I can whip up a quick flat-screen-face module for anons who can't (or won't) shell out the $400 for a bendy screen.
Open file (7.04 MB 320x570 Jukeboxtest.mp4)
Jukebox is 95% complete, just need to add a pre-recorded voice prompt for when it misunderstands a command >_<
>>29907 <im sorry dave
>>29907 Nice progress, Mechnomancer. >>29909 Kek.
>>29907 >sorry, anon, you don't have the license to listen to that song.
>>29922 Actually, I do. Stringstorm is an independent music composer on youtube who makes music for "If The Emperor Had a Text-To-Speech Device" (plus other original songs) and occasionally has his entire discography up for free on Christmas Day :)
Printing off some aesthetic panels for SPUD, starting with the arms. Full range of motion is preserved.
>>29940 Those should fill out the morphsuit nicely.
Open file (263.48 KB 299x422 glasses.png)
Open file (2.17 MB 920x1442 armpanelz.png)
I got a brainwave for the physical face: lcds disguised as glasses! Also the arm panels just look so nice I can't stand them being covered up by the morphsuit. Going for some Alita vibes lol.
>>29959 Arms do look really good, Mechnomancer. I'll be interested to see your results for the glasses. BTW did one of Spud's 'Chii ears' break, or maybe you have it open for service? Looks like it's drooping open. Good luck, Anon! Cheers. :^)
>>29959 That's a solid idea. I saw a video of a girl building a Clank for her cosplay project, and I liked how she did the eyes. https://www.youtube.com/watch?v=xIcRPAMU7oc Custom animations behind resin printed, smoothed and polished lenses.
Open file (274.42 KB 515x515 backside.png)
Open file (886.51 KB 759x1073 front.png)
Open file (899.48 KB 615x1073 hipback.png)
Open file (1.36 MB 805x1070 frontdown.png)
Open file (867.86 KB 795x1075 front_3_4.png)
Been spending the past few days printing parts for SPUD's hip panels. Total of 10 separate panels to keep down the amount of filament used for supports, each panel taking about 8 hours to print (thankfully I have 2 printers). They will definitely need some sanding >_<; Also printed some new curved rib pieces. Need to adjust some of the pvc in the legs before I can get her standing again, but I did find my mech scale while cleaning the mechworks so I should be able to get an official weight soon. Next bits of coding I need to deploy are a) having the voice-command scripts be able to tell the main script to play audio without requiring the TTS server, e.g. playing a .wav file of SPUD saying "oops" or "try again" with the mouth flaps syncing to it, b) copying over the weather forecast script with the cache (the weather forecast service I use doesn't like more than ~10 API requests in a day lol) and c) mapping the servos (again) and integrating the old animation script.
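The cache for b) is basically just "don't hit the API if the last answer is still fresh" - a rough sketch (the URL here is a placeholder, not the service I actually use, and two hours is an arbitrary freshness window):

import json, os, time
import urllib.request

CACHE_FILE = "forecast_cache.json"
MAX_AGE = 2 * 60 * 60                      # seconds before the cache goes stale

def get_forecast(url="https://example.com/forecast"):
    if os.path.exists(CACHE_FILE):
        with open(CACHE_FILE) as f:
            cached = json.load(f)
        if time.time() - cached["when"] < MAX_AGE:
            return cached["text"]          # reuse the last answer, no API call
    with urllib.request.urlopen(url) as resp:
        text = resp.read().decode()        # the rare, rationed API call
    with open(CACHE_FILE, "w") as f:
        json.dump({"when": time.time(), "text": text}, f)
    return text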
>>30060 That is really excellent-looking progress, Mechnomancer. Looking forward to seeing Spud fired up and chatting away with her newest look! Cheers. :^)
>>30060 whoa boy, whoa mama
>>30061 Integrating the LLM is a little ways off: I still have to figure out how to get the oobabooga AI to act sane through the API. I think it has something to do with the history formatting not working properly, so I might have to implement my own solution. That will probably be just some list manipulation, which, compared to the modular stuff I implemented (scraping directories, formatting filenames into voice commands and then detecting them by comparing every word), should be a piece of cake. Custom history would work by adding each response to a list with a label prefix (user:/AI:) and deleting the oldest response to ensure it doesn't exceed the AI's limit (rough sketch at the end of this post). Maybe have a second set of history that I can ask the AI to summarize into a sentence or two (if that takes a while I can have SPUD say "Hold on, I'm thinking" or something if the user tries to talk to her while this is going on). Then reintegrate the memory retrieval protocol and inject the user preferences into the AI prompt so it might one day just ask why the user likes "Samuel Adams" or whatever. Also the wig arrived today, but I have to install a new forehead plate cuz without it the bangs are shaped all funny.
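The history juggling I have in mind is roughly this sort of list manipulation (the character budget and labels are arbitrary):

class ChatHistory:
    def __init__(self, max_chars=2000):
        self.turns = []                    # e.g. ["user: hi", "AI: hello!"]
        self.max_chars = max_chars

    def add(self, speaker, text):
        self.turns.append(f"{speaker}: {text}")
        # drop the oldest turns until the whole history fits the budget
        while len("\n".join(self.turns)) > self.max_chars and len(self.turns) > 2:
            self.turns.pop(0)

    def prompt(self, user_text):
        self.add("user", user_text)
        return "\n".join(self.turns) + "\nAI:"

hist = ChatHistory()
print(hist.prompt("What's the weather like?"))
hist.add("AI", "Looks sunny today!")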
Since you have dead pixels, are you still set on using that same monitor for a full face? I was thinking a color e-ink display would be better since it won't glow in your eyes, which has a potentially more immersive effect, and the lower frame rate of these kinds of monitors likely isn't as big a deal when doing an anime-styled face. E-ink can be flexible too. Though that might be costly for a large screen and difficult to find, so you could go back to something closer to your original design where you have individual monitors for the eyes, but instead leave room for eyebrows in the monitors for more expression, and use a third monitor for the mouth. Though I am uncertain if that would be any cheaper than a single large monitor. In addition to that you could use some sort of air pump that pumps up small balloons under the cheeks so SPUD's cheeks can puff up when smiling or pouting, assuming you use an animegao kigurumi or a silicone moulded doll face instead of the pepakura or stiff-looking plastic you were using before. That should be quieter than the noisy actuators you were using before and have a more humanoid feel. But I'm just a wanderer who stumbled on this board so don't mind me much.
>>30082 >But I'm just a wanderer who stumbled on this board so don't mind me much. Hello Anon, welcome! Please have a good look around the board while you're here. If you'd like to, please introduce yourself in our Embassy thread : (>>2823) . Great ideas about the face! BTW, we have a couple of related-threads here on these topics: Robo Face (>>9), and Robot Vision (>>97) . Please browse through those (and dozens of other good) threads to see if anything piques your interests. Thanks for stopping by and contributing, Anon! Cheers. :^)
>>30081 If you want some instructions on how to do this with GPT4All instead, I have that python code already.
>>30090 Thanks to you I looked up GPT4all and found some python code, I'll be sure to experiment with it once the model downloads. >>30082 Flexi-screen face is a temporary solution until I make a more articulated physical face (also an excuse for me to buy one to play with) :) Physical face will have servos in the cheeks to pull the mouth into a smile, so that should add a bit of life to it. Took a leaf from your book and checked out some e-ink screens: I could get a pair with color for relatively cheap (cheaper than the flex-screen, anyway). I'll have to see how long the screen displays a pic while unpowered, as that could eliminate the need for physical eyelids.
>>30090 If you are experienced, there is also https://sillytavernai.com/
Open file (24.78 KB 982x630 gptforall code.png)
>>30093 From what I can tell, Sillytavern is more of a front-end that can interface with various AIs including (but not limited to) chatgpt and oobabooga :) I did figure out some code for GPT4All so I've attached it (so easy!). Biggest problem was figuring out why the AI was pretending to be the user (it just does that lol) and how to stop it (stop the text gen if parts of "USER" are in the token). Also created a list for user responses and AI responses that can be more easily read out, and randomized the temperature to get a bit more variation in responses. Some of its responses remind me of chatgpt before it was nerfed.
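For anyone who wants to try it, the gist looks something like this (treat it as a sketch - the exact gpt4all python API and model filename may differ between library versions):

import random
from gpt4all import GPT4All

model = GPT4All("mistral-7b-instruct-v0.1.Q4_0.gguf")   # downloads on first run

def ask(prompt):
    temp = random.uniform(0.5, 0.9)        # a little variation in each reply
    reply = ""
    for token in model.generate(prompt, max_tokens=200, temp=temp,
                                streaming=True):
        reply += token
        if "USER" in reply:                # model started talking as the user
            reply = reply.split("USER")[0]
            break
    return reply.strip()

print(ask("USER: Tell me a fun fact about potatoes.\nAI:"))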
Open file (635.61 KB 545x1065 bellybutton.png)
Open file (451.19 KB 799x855 newig_face.png)
Open file (1.39 MB 1525x519 capz.png)
Added an ab plate and some caps on the hip side motors; all of these press fit onto the chonky side motors. Going to build the thigh panels out of EVA foam so when SPUD sits they don't break. Screen face looks a bit odd with the wig, maybe if I added a bit of a fade on the top of it or something to make the transition to the edge of the screen less harsh... Still, looks kinda nice. Now I need to deploy GPT4All into the server script. Unfortunately (or fortunately, as the case may be) the mistral-7b-instruct model emotes only through ASCII emojis, which act a little funny in Python IDLE but fine in notepad. But at least the format it uses for the emotes is consistent enough to develop a protocol... the fun part will be making the graphics for 38 different emotions. At least I got a good base to start with :)
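The emote protocol will probably boil down to something like this (the emoticon list and face filenames here are placeholders, not the real 38-emotion set):

import re

EMOTE_TO_FACE = {
    ":)": "face_happy.png",
    ":(": "face_sad.png",
    ";)": "face_wink.png",
    ":D": "face_grin.png",
}

def split_emote(reply):
    # peel a trailing emoticon off the model's reply and map it to a graphic
    m = re.search(r"(:\)|:\(|;\)|:D)\s*$", reply)
    if m:
        return reply[:m.start()].rstrip(), EMOTE_TO_FACE[m.group(1)]
    return reply, "face_neutral.png"

text, face = split_emote("Sounds like a great day for a walk :)")
print(text, "->", face)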
>>30125 Very nice! I like the 'belly boop' plate & the new wig, Mechnomancer. :^) Some observations: * I think the cap is great, and it looks aesthetic. I'm concerned that it may trap too much heat in the actuator housing and overheat the motor? * The ears are a good initial design, and very cute. Now that you are homing-in on the face proportions however, they seem rather oversized to my eye. If you undertake a redesign for them at some point, I'd suggest you make them a bit more petite, more in keeping with Spud's cute feminine characteristics. * I think you have a good idea about the 'forehead fade' for her face illumination. You might also consider making that area more rounded as well, in keeping with your graphical design of the shape of her jaw/chin. * If you do go with color e-ink, you can probably place illuminating LEDs embedded within the bezel of her 'helmet' mask -- a la what's commonly used in filmmaking for astronaut movies, etc. * I think you're very close to having nailed the general proportions of the face to the rest of the upper body. This is easy to miss, but your pic #1 really shows this clearly as working well already. She's already come a long way in a short time, Anon. Great progress! Cheers. :^) >=== -minor edit
Edited last time by Chobitsu on 03/05/2024 (Tue) 05:34:50.
Open file (131.68 KB 660x1624 spud3dmodel.png)
>>30126 The caps only cover the tip of the 550 motors on the servo, about 1/4 its length: there is also enough resistance in the gearbox that SPUD nearly stands *unpowered* so those motors won't be subject to much overheating unless SPUD is dancing non-stop for like 20 minutes or something. Even then, I could probably slip some heatsinks in there. The ears have to be their current size in order to hold the USB hub/hdmi cable combo. Can't make them smaller unless I get a smaller usb hub in there (having 4 ports is nice tho). I could make a headband sort of thing for where the screen meets the forehead, but this face design has already proven its point, and with a few more tweaks it should suffice until I re-do the articulated cloth face with the e-ink eyes. I designed SPUD's body pretty much in 1 single file and made sure every part looks nice together (with an exception of the PVC linkages in the limbs, which I eyeballed)
>>30127 I understand. Please understand Anon that when I'm making a constructive critique of someone else's project work, I'm not in any way denigrating that work, nor belittling the creator of it. Quite the opposite. I'm simply bringing an objective set of eyes to the problem, and if I make such a comment I'm literally trying to help 'plus' the work. After all, we're all in this together! >I designed SPUD's body pretty much in 1 single file and made sure every part looks nice together That it does, Mechnomancer! Looking forward to your progress. :^)
>>30131 Hehe I understand. I'm just responding to your constructive criticism. :) Were it not for posting here, I'd have no idea about things like GPT4All. It is always nice to get feedback from folks working on the same subject. In my experience the biggest negative nancies are the folks who haven't built anything, hence have no idea of the effort/requirements it takes and thus probably don't have anything of value to contribute anyway. I haven't experienced it from folks here, as I have been on the receiving end of genuine butthurt elsewhere enough times to accurately identify it. There are so many design choices one can take that unless you just dive in and do something, one can be paralyzed just from the sheer number of potential solutions.
>>30132 Thanks for your understanding, and for your mature professionalism Mechnomancer. :^) >There are so many design choice one can take that unless you just dive in and do something, one can be paralyzed just from the sheer number of potential solutions. This. It's hard for me to conceive of any other design & engineering undertaking apart from creating DIY robowaifus that has so many 'paralysis by analysis' opportunities! :D
>>30092 >Physical face will have servos in the cheeks to pull the mouth into a smile, so that should add a bit of life to it. Taking the hard route then. So many humanoid bots wind up looking off or even creepy as can be. It will take a lot more than just the four cables in your earlier drawing to make more convincing expressions, I would think, but it depends on whether you're combining this with other methods. >Took a leaf from your book and checked out some e-ink screens: I could get a pair with color for relatively cheap (cheaper than the flex-screen, anyway). Good to hear something I said is going to be used by someone. >I'll have to see how long the screen displays a pic while unpowered, as that could eliminate the need for physical eyelids. E-ink displays last a long time, if not indefinitely, without power. I have seen them used as shelf labels in grocery stores with no visible power source. At least that is the case when stationary. I am not sure what happens if they are exposed to vibrations, changing lighting conditions and EMFs.
Open file (959.28 KB 647x1091 facefade.png)
>>30133 >It's hard for me to conceive of any other design & engineering undertaking apart from creating DIY robowaifus that has so many 'paralysis by analysis' opportunities! :D Mechs and Powerarmor is another undertaking rife with such paralyzing opportunities. Since SPUD has GIMP installed adding gradient to the face texture took like 5 seconds :D
>>30146 >Mechs and Powerarmor is another undertaking rife with such paralyzing opportunities. I can only imagine. And human-safety is an even more-present concern in that case, AFAICT. >Since SPUD has GIMP installed adding gradient to the face texture took like 5 seconds :D That is indeed an improvement to Spud's look, IMO. Perhaps you could burn (lighten) around her eyebrows a bit as well? I think that would make her face 'pop' out of the mask a bit better. Cheers. :^)
Tested adding an additional expression to SPUD's library, and it went flawlessly. Oh yeah, also integrated the LLM into the server script (as well as a timeout). Having a reaction to one's voice (letting you know she is thinking about it) makes a big difference.
Open file (243.31 KB 329x457 fatface.png)
increased face width by 22 pixels, looks much better imo
Open file (752.87 KB 1000x565 The_Shootist_John_Wayne.png)
>>30161 Nice! >Having a reaction to one's voice (letting you know she is thinking about it) makes a big difference. Agreed. Her mouth seems a bit like a frown to my eye, rather than a "hmm, let me think about this.." Maybe a half-frown? >also <Red Dead Redemption Heh :D https://www.youtube.com/watch?v=9UIRoW13gOw >>30175 Having a full-screen face certainly offers many easy-to-change opportunities for nice variations!
Open file (18.36 KB 179x150 expression.png)
>>30178 Well this is what the face looks like when not obscured by hair :)
>>30182 Very cute design. You made a good choice here for Spud, Mechnomancer. :^)
Open file (675.43 KB 682x384 randeyetest.mp4)
Spent the past 12 hours or so slamming my head against a new addition: movable eyes! It is as easy to set up as naming the proper files! If iris files (and supporting alpha files) exist for an emotion file set, it will use the files to make the irises move according to a coordinate set. If the files are not found it will display the static eye file (good for icon-like eyes). Graphics are a wee bit rough, but it is something only I'd probably notice anyway. Eye coordinates are a variable to be set like any other, where negative x/y values go left/up and positive x/y values go right/down (just don't exceed 10 in any direction tho). Currently have them set to move during the blink to random positions but other options are possible. I have them on the main computer, just need to transfer over to SPUD's avatar.
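In Pillow terms the iris compositing is roughly this (filenames and the pixels-per-unit scale are placeholders, not the actual asset names):

import os
from PIL import Image

PIXELS_PER_UNIT = 4             # how far one coordinate step shifts the iris

def draw_eye(emotion, x, y):
    # clamp to the +/-10 range mentioned above
    x = max(-10, min(10, x))
    y = max(-10, min(10, y))
    base = Image.open(f"{emotion}_eye.png").convert("RGBA")
    iris_path = f"{emotion}_iris.png"
    if not os.path.exists(iris_path):
        return base                         # static eye, icon-style
    iris = Image.open(iris_path).convert("RGBA")
    ox = (base.width - iris.width) // 2 + x * PIXELS_PER_UNIT
    oy = (base.height - iris.height) // 2 + y * PIXELS_PER_UNIT
    base.paste(iris, (ox, oy), iris)        # the alpha channel acts as the mask
    return base

# e.g. draw_eye("happy", -10, 3) looks hard left and slightly down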
>>30224 That's good news, Mechnomancer. I consider it a key metric that a project is both fundamentally sound, and on the right course, if things get easier and easier in the system, and the path forward with it gets clearer and clearer as time goes on. Please keep up the great work! Cheers. :^)
>>30226 A little copypasta and the code was easily deployed onto the avatar. Even added a little bit where the eyes might not blink while the eyes change direction. I could do a little tweak externalizing the AI object recognition so I can get smooth eye movement, but that only runs while SPUD's eyes are closed and eyes rarely move smoothly anyway. Might try to make Spud track any people she sees, idk. I also put in a bit of code so when the pi detects an external monitor it automagically adjusts the face to the proper-ish orientation.
>>30227 Facial animation is the subtlest of all animation. Good luck, Anon! :^)
>>30227 now for the shoulder mounted rocket launcher, or just a gimbal for the head is fine i guess, i assume its not that different from what you're doing with the eyes, my telescope has a computerized one and those things are heavy so it should work with a head, you could reuse the same code, just subtract the object's position from the current position to make it center on the object
>>30231 As Chobitsu mentioned earlier >>26388, this place really isn't meant for discussing the construction of waifu-terminators. Make waifu-wuv not war.
Open file (17.74 KB 1069x702 Untitled.png)
>>30240 war is the pinnacle of love. could still do that with the head, it's the same thing: a gimbal is just two motors perpendicular to each other
Open file (164.14 KB 1500x1500 71YNUuUeFuL._SL1500_.jpg)
>>30230 One of the best ways to avoid uncanny valley is to go iconographic. Through simplification (and avoiding looking like a real human) the brain fills in the gaps with what it wants rather than having to deal with the inaccuracies that exist. Even so, I ordered some 2-part casting silicone off Amazon so I can experiment with molds/face making. Cuz, yknow: it's difficult. But will make it easier on myself by going for the f̶e̶l̶i̶n̶e̶ anime look. However, many robots out there just seem to put a rubber mask over the face mechanisms. Instead, I'm going to be attaching cables and such directly to the face to move it. Will probably end up making a whole new head but hey that's how design goes. >>30257 Spud's neck already has motors (as do all her joints), I just don't have them powered right now - the buck converters have yet to be installed into the shell and the servo channels properly mapped. The object recognition AI returns screen coordinates of the people it detects, so it isn't that difficult to turn into movement (I did that before the overhaul to the screenface, can't seem to find the footage tho).
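Turning those detection coordinates into neck movement is basically a proportional nudge like this (frame size, gain, channels and the set_servo() helper are placeholders for whatever servo library is in use):

FRAME_W, FRAME_H = 640, 480
GAIN = 0.02                       # degrees of servo travel per pixel of error

pan_angle, tilt_angle = 90, 90    # start looking straight ahead

def track(person_x, person_y):
    global pan_angle, tilt_angle
    err_x = person_x - FRAME_W / 2
    err_y = person_y - FRAME_H / 2
    pan_angle = max(0, min(180, pan_angle - err_x * GAIN))
    tilt_angle = max(0, min(180, tilt_angle + err_y * GAIN))
    # set_servo(PAN_CHANNEL, pan_angle)     # hypothetical helpers
    # set_servo(TILT_CHANNEL, tilt_angle)
    return pan_angle, tilt_angle

print(track(400, 200))            # person right of center and above center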
Open file (2.81 MB 2101x1049 ankle.png)
Forgot to get mold release for the silicone. Oops! Anyway, finally redesigned the shin to be more stable - gonna end up with some femisapien/megaman-esque shins I guess - and assembled one (top 3). It is much sturdier than the old one (middle bottom) since it has a rotational axis on both sides. Once the second ankle is complete SPUD might be able to stand on her own two feet 𝘸𝘩𝘪𝘭𝘦 𝘶𝘯𝘱𝘰𝘸𝘦𝘳𝘦𝘥 ! Also going to start work re-designing the head for the eventual silicone face.
Open file (1.27 MB 541x1461 unpowered spud.png)
As I suspected: not only can SPUD stand with the new ankles, but can do so 𝘸𝘩𝘪𝘭𝘦 𝘶𝘯𝘱𝘰𝘸𝘦𝘳𝘦𝘥 .
>>30405 That's a good posture lol. Are you going to round the feet out? Or at least cut off the corners for a sort of elongated octagonal or hexagonal shape? Would look better than rectangles.
>>30407 After I get Spud walking on those planks I'll be making the soles swappable. That way I can quickly/easily see what sort of foot shapes I can get away with.
>>30408 You're planning to make SPUD wobble forward kinda like a penguin, did I understand everything right?
>>30424 Well the plan is to get 'er to walk kinda like a femisapien: https://youtu.be/1UQJoVSl0Mw?t=8 Not sure how wobbly or penguin-like you would consider that.
>>30425 To get that stable, if you copied that walking style you'd need the ankles to tilt side to side. Notice how the femisapien steps on the inner side of the foot and it causes a near fall over to the side on every step because of that? You need ankle tilt to counter that.
Open file (5.83 MB 480x854 Leg Dance.mp4)
>>30427 The pics here >>30373 are of the ankle motor :) I had to reprint some parts so the joint was more stable. Here is a vid of what it was like before (also the ankles were tilting the wrong way lol).
>>30428 I see. Yeah that would do. It seems a lot of side-to-side leaning walks involve stepping either flat-footed or on the outer edge of the foot first. If you could put some sort of mechanism on the torso to keep the center of gravity more centered, you'd possibly get a swaying-hips effect if you still have stability issues arising.
>>30429 Did someone say Torso Twist? It's here :D >>29145 I need to properly mount the gyroscopic sensor in the torso tho.
>>30430 A twisting motion would help some appearances, but I meant keeping the upper part of the body from swaying side to side, because the tallest parts could sway a fair amount, potentially throwing off balance. Though you may have thought of that before.
Open file (2.28 MB 1533x1081 neckpistons.png)
Open file (370.17 KB 817x583 JBweld balljoints.png)
Open file (84.76 KB 743x539 neck fit.png)
Been having good weather so I've been doing non-robowaifu robotics over the past few days P: Since SPUD's head is just about only held up while the servos are powered (and the servos heat up a bit while doing so), I figure since I'm eventually re-doing the head I should re-do the neck so it provides a bit more support. S̶t̶o̶l̶e̶ modified some inmoov parts, but 3d printed ball joints on this scale don't work very well. As a result, I'm just JB welding some ball joints (originally meant for a lawn tractor steering arm) onto the 3d printed screw drive sleeve. Won't be pretty but I can cover it up with a 3d printed shroud or something. Two screwdrive servos will be roughly imitating neck tendons with a third servo on top to twist the head left/right.
>>30510 Looks good, but I don't understand what part you welded to what.
>>30512 This. I need to secure the servos and find the optimal height to get the maximum tilt range.
Open file (487.64 KB 734x528 kamichan idea.png)
With spring(ish) weather comes more time spent on my heavy-duty robotics, so not much robowaifu progress (unless your robowaifu needs a 1350lb servo). When it gets a bit warmer I'll be trying out the silicone I bought. Sometimes stepping back from a project and working on a second project can lead to ideas about the first project: I figured a good compromise for the next iteration of the face would be a hybrid of a physical face and screen face: make the eyes physically motorized like kami-chan here, but cut out holes for the iris and stick a smol lcd screen behind them. That way even when the screens are off and eyes are open it still looks kinda cartoony and cute. I did get some full-color e-ink screens for a laugh, however it takes 30 seconds of screen flashing to change a picture. Not very good for robowaifus. Good for nametags and signs tho.
Super-stoked seeing all the great progress continuing, Mechnomancer. Also, the new balljoints look really cool. >>30405 >but can do so 𝘸𝘩𝘪𝘭𝘦 𝘶𝘯𝘱𝘰𝘸𝘦𝘳𝘦𝘥 . Perfect!
Open file (558.87 KB 680x460 neck1.png)
Open file (386.56 KB 680x458 neck2.png)
Open file (223.13 KB 346x470 neck3.png)
Hello, it's been a while. Since springtime has arrived I've been busy with other stuff, but I finally managed to get around to installing the new neck. Need to re-print some pieces though before I can do a powered test, but I got a general idea for proportions. Neck seems a bit long, so when I redesign the skull I'll try to shift the entire face downwards. Thinking about going for a more robot-y head (a la medabots or drossel) to avoid the uncanny valley while I work on a more intricate (perhaps rubber) face. The head is attached by 3 screws so swapping heads will be relatively easy (might need to make a custom connector for the i2c and face servos or just put a 3rd servo board in the head). Remember, SPUD wuvs you ^3^
>>31022 Hi Mechnomancer! Very glad to hear from you again and all that's been happening with SPUD. The neck looks nice! I think it's a very good idea at this point in time (and likely so for the next decade) for all of us to not try to make faces look too realistic, but instead make our waifus 'robot-y'. Good call. Cheers, Anon. :^)
Open file (8.21 MB 180x320 robot looky-loo.gif)
Open file (1.45 MB 991x1053 big brain idea.png)
Some may wonder "Why change the neck mechanism? It works!" This is true, it does work. However, the direct drive in the neck joints means there is stress on the servos just to maintain position. And while it works for a short time, I didn't feel comfortable with any long-term tests as the servos heated up considerably during the short tests (bear in mind I plan for public exhibition of the 'bot for several hours). With the new neck mechanism, the screw drive more or less locks it in place so energy is only used while the servos move. I got another shipment of printing filaments, so soon I'll be starting on the design on a medabot-style/simplified robo face. The mood board is for your amusement as much as it is for my own utility.
>>31041 It's deffo good thinking to keep your robowaifu's energy consumption to a minimum, especially during idle times.
Open file (49.84 KB 496x510 IMG_3987.jpeg)
I have been on hiatus - I last browsed /robowaifu/ in January, as I was really frustrated with the board. I'm happy this was the first thread I saw on this visit. Following SPUD's progress was and is pure joy! I'm currently reading about Soar and cognitive architecture and your thread is keeping me motivated, thank you for sharing SPUD! I also absolutely adore the screen face. I think it's a great way to avoid the uncanny valley while keeping it expressive!
>>31022 Nice
Open file (2.45 MB 2424x1060 trapezeuses.png)
I didn't need to shift the head down, just needed to build up the shoulders/neck according to proper anatomy by adding trapezius panels (no this doesn't mean SPUD is a trap lol). Next up will be a redesign of the sternum so it extends further upwards and holds some clavicle bits.
Open file (3.73 MB 2388x1064 clavicle closeups.png)
Open file (605.75 KB 520x1044 clavicle overall.png)
Clavicles installed and it makes me smile. Might adjust the trapeziuses (trapezii?) in future. It's the little things that make all the difference.
Open file (545.98 KB 834x338 roboface.png)
New face dropped, waiting for new mounting brackets to finish printing to attach it to the skull.
>>31079 oh haha, cute bushy eyebrows
Open file (393.81 KB 687x440 eyesglo.png)
Open file (585.50 KB 825x566 2 heds.png)
Took me a while to get the eye lcds working again. Apparently the initial driver for the ILI9341 (and the one you find on the net) was using the incorrect Raspberry Pi GPIO library (or it was deprecated), and while I had it running in my other code I didn't have it implemented in the example! A few hours of banging my head against the wall, followed by a break, and then I discovered the problem. Won't be making the same mistake again: I overwrote the initial example so now I will always have the working driver. There is something odd going on with the viewing angle with the LCDs in landscape orientation... a very small horizontal viewing angle (you can see how dark the far eye screen is in the 2nd pic). Ah well, I will have to re-design the face to have the eye screens in portrait mode. Well, you know what they say... 2 heads are better than 1!
Got the servo control board wiring more compact and a little motion going along with the wandering eye program. Need to re-balance the head though because the neck tendon servos can't move it very well..
NICE. This is a remarkable project already, Mechnomancer. I'm looking forward to seeing again what you accomplish in another month or two! See you then & cheers, Anon. :^)
Open file (453.22 KB 1854x1050 neckv2_overall.jpg)
Open file (309.38 KB 1584x1070 neckv2_closeups.jpg)
Overhauled the neck by shifting the mounting points for the neck actuators forward by 35cm. A combination of that and using the on-board controller (I burned out the servo testing board, I don't think it can provide enough current) allows the head to be tilted side/side and up/down without putting too much stress on the servos. Unfortunately, during the testing when I burned out the servo testing board the left neck piston snapped, and I'm out of JB weld - that is what I get for only 10% infill. Slap some jb weld on there, let it cure, then I can add a head wander program like I had with the previous neck mechanism. Or maybe print some better parts that don't have giant globs of JB weld on them lol. I also printed a new forehead plate that is 25mm longer, and shifted the wig up a bit so the hair looks less emo. Before she goes to any conventions I'll be replacing the screen because now the dead pixels are quite visible. Also been cleaning up my outdoor workshop so I can have the space to try making her walk when ready.
Maybe a lens over the LCDs would bend the light to get around the viewing angle problem?
>>31163 Let the butthurt flow through you lmao.
>>31167 That anon was just saying the original face is nightmare fuel which isn't wrong.
Open file (5.34 MB 320x570 Basic Gyro Test.mp4)
Turns out the neck servos have the ability to continuously rotate, and will choose the shortest distance to rotate rather than retaining their absolute position. So if it is moving through 180 degrees there is a 50/50 chance of it moving clockwise or counterclockwise. Solution? Interpolate values. Even then it will still occasionally freak out, so I'll probably end up replacing them with ones that... don't. In the meantime I got a basic gyroscope program in the same script as the face code & AI image recognition: might split servos/balancing into a sub-program, as each iteration takes upwards of 300ms and the balancing code optimized for the long iterations (in video) isn't as good as the independent program. You can also see the neck slop from the ball joints; need to put a small elastic in the neck to hold it on the screw thread inside the pistons.
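The interpolation fix is about this simple (step size is arbitrary):

def interpolate(current, target, step=5):
    # walk toward the target in small increments so the servo never gets to
    # "choose" the short way round through the 0/360 wrap
    frames = []
    while abs(target - current) > step:
        current += step if target > current else -step
        frames.append(current)
    frames.append(target)
    return frames

# 170 -> 10 degrees now steps down through 165, 160, ... 15, 10
print(interpolate(170, 10))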
Open file (4.77 MB 320x570 Improved Gyro Test.mp4)
>>31193 Now have improved balancing code running at the same time as the main program (including the AI image recognition for the webcam), and added a feature to prevent rapid transitions between the two. Also, the servoboards & gyro would immediately stop the program if an ant tripped over the cables and they disconnected for a second, so I added a little feature that just has the program wait for the signal to return. For larger tilt values I'll have to figure out a solution to keep it from overshooting.
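The wait-for-signal bit is just a retry loop wrapped around the sensor read, roughly like this (read_gyro stands in for whatever actually talks to the IMU over I2C):

import time

def read_gyro_safe(read_gyro, retry_delay=0.2):
    while True:
        try:
            return read_gyro()
        except OSError:               # I2C dropouts usually surface as OSError
            print("gyro disconnected, waiting...")
            time.sleep(retry_delay)   # keep waiting until the signal returns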
Open file (404.32 KB 1080x447 pelvis.png)
Open file (1.53 MB 1272x1216 spine.png)
Printed a new housing for the pelvis/abdominal servo mounts so it is less wobbly, and started work on back panels. Some paper templates for scapula panels and a print-in-place articulated spine that serves no other function than to look nice.
Amazing progress! Do you always prototype with paper?
>>31235 Sometimes I use paper; beats the heck outta waiting for a print only to find out it is the wrong size. Speaking of, I used some paper to make a little guy I call "Sakana Kuchibiru-san" (Mr Fish Lips) based on a mechanism someone posted elsewhere. Replace the paper with stretchy rubber and a few servos to pull it into a smile/frown and I've got a good mechanism.
Great progress overall. I'm glad to see that.
Manual test of the print-in-place spine. I need to anchor it to the frame more and sand the dumpy to remove those print supports.
>>31290 hard to tell at this angle, is the spine bending more in one direction than in the other?
>>31290 yeah the one between the scapulae needs to be anchored, other than that it's great
>>31298 It's a little loose because I'm working on adding another joint into the shoulder so SPUD can rotate her arms straight up. The arms keep getting more and more joints lol. Also need some more white panels on the back :3
Open file (172.13 KB 673x1236 bigshouldermotor1.jpg)
Open file (169.73 KB 694x1239 bigshouldermotor2.jpg)
Got the left shoulder with the new chonky servo mount, but it sticks out the back a fair bit so the scapula will end up having little motor caps on them I guess. Can't figure out yet if I can use the deltoid panels or not. Will either have to wait for another shipment of chonky servos to arrive to check it out fully, or borrow the ankle servos.
Open file (517.20 KB 512x768 ClipboardImage.png)
>>31313 Please keep it, your progress is fantastic.
Open file (2.34 MB 630x354 KT19-1868039807.gif)
>>31313 what's the plan to program the arm? is it just preprogrammed positions or dynamic like the robotic arms in factories? no idea how they program those
>>31320 I could do fancy inverse kinematics etc. But at least to start off I'll be re-applying the code I used for one of the videos in the original post (I re-attached it here so you don't have to scroll up). I modified a blender plugin that exports bone positions to .json files to make it export to a simple .txt file with a list for each frame (a rudimentary example is the video of the hand fingers wiggling). I have to re-implement the code, then get the power converters (5v and 20v) & battery (40v 5AH) installed on the body. And since I'm going to be trying a screen/physical hybrid face anyway, I'm printing a facemask (edgerunners kiwi, *cough*) to check how I could mount a future rubber face over the screen.
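Playback on the Pi side is roughly along these lines (the file format shown is only my guess at a "list per frame" .txt, and set_servo is a stand-in for the real servo call):

import ast
import time

def play_animation(path, set_servo, fps=15):
    with open(path) as f:
        for line in f:
            positions = ast.literal_eval(line.strip())   # e.g. [90, 45, 120]
            for channel, angle in enumerate(positions):
                set_servo(channel, angle)
            time.sleep(1.0 / fps)                        # hold until next frame

# play_animation("wave_hello.txt", set_servo)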
Open file (11.51 MB 320x530 robowaifu_wave.gif)
Installed new motors and unpainted facemask. Will probably make new face graphics to take advantage of the extra screen space. Movements are not fluid because I'm using a servo testing board.
>>31384 that is very impressive IMO. Looks so cool great job.
>>31384 Nice! I like the ribcage design - very neat.
Open file (11.88 KB 548x298 newshoulder.png)
Open file (264.05 KB 548x1920 spud got thunderthighs.jpg)
Moved SPUD's ribcage up 3cm so it doesn't seem as long. A subtle change that required printing 3 parts at 4 hours each lol Thighs look extra thicc, which is ok I guess up near the hips but too wide at the knee for my liking, so I'll be figuring out how to slim down the leg linkages, while also adding in another shoulder joint to keep the motors from popping out of SPUD's back so much: perpendicular to the front, then an additional servo to lift the arm distally halfway (see newshoulder.png if you can decipher it lmao).
Open file (4.68 MB 300x168 servo_gear.gif)
Prototype of 2nd shoulder joint: ring gear with a 2:1 reduction.
Open file (4.46 MB 480x854 Shoulder Joint WIP.mp4)
Application of the prototype. There are some tolerance issues, such as between the teeth on the ring gear/servo gear and the orientation of the ASMC-04B. Thankfully each iteration of the part only takes 2 hours to print. Since the shoulder is gearing down a 70kg servo 2:1, it should be able to lift the arm no problem. A 70kg servo was able to raise it 90 degrees with the old rack-and-pinion system, while this new joint only goes 90 degrees, with the ASMC rotating the whole kaboodle to give movement over the head. I'm very excited about this part because it just looks so ROBOTIC.
WOW! Great-looking progress while I was away, Mechnomancer. Keep up the good work! Cheers. :^)
Open file (221.03 KB 807x1071 shoulderoverall.jpg)
Open file (434.97 KB 2405x1045 shoulderdetail.jpg)
After an obscene amount of revisions the new shoulders are just about done, I just need to replace a servo in the right shoulder and left elbow because they won't work under the full weight despite being rated for the same torque as the motors in the same joints on the opposite arm.
>>31570 are you not afraid the hair is going to get stuck in some moving part?
>>31570 Looks great, Mechnomancer. Please keep us posted. Cheers. :^)
post moar
>>31571 That's why she has the updo hairstyle. >>31596 Gonna be trying to get a hand/arm waving before the week is out, and replace the face screen. But first I have to replace the bottom of the ribcage because the servo horn popped out of the 3d print :/
Open file (185.90 KB 789x1024 adjusted_eyegraphics.jpg)
Ok, replaced the bottom of the ribcage with a new 2-part bit for reasons nobody would really care about (easier to swap out the part that previously wore out in this new iteration) and wrote a batch script to adjust all the face graphics to fill up the mask face more. Doesn't look too shabby for 15 minutes of mucking around. Next I gotta test the servo control boards and map the servo channels for the arms, calibrate the positions, then find a way to mount the power converters.
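The batch adjustment amounts to something like this in Pillow (the scale factor and folder names are made up for illustration):

import os
from PIL import Image

SRC, DST = "faces", "faces_scaled"
SCALE = 1.15                          # grow everything ~15% to fill the mask

os.makedirs(DST, exist_ok=True)
for name in os.listdir(SRC):
    if not name.endswith(".png"):
        continue
    img = Image.open(os.path.join(SRC, name))
    new_size = (int(img.width * SCALE), int(img.height * SCALE))
    img.resize(new_size).save(os.path.join(DST, name))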
Open file (6.21 MB 320x570 Spud Mask Looky.mp4)
It's a small thing: SPUD randomly looks around/turns her neck while the main program runs. Not sure if I can get a neck tilt in there, but I can try.
>>31605 >>31615 Thanks. Good luck with your efforts, Mechnomancer. :^) >>31616 Excellent. Glad to see you tackling the many & varied, so-called 'secondary (ani)motions' [1]. As with the exceptional sensitivity humans have to facial (ani)motions, we learn to spot this stuff from birth. --- 1. https://www.youtube.com/watch?v=uXHnudwQde0
Open file (288.59 KB 1473x1065 spud legpanels.jpg)
Thinned down the thighs by trimming the bits covering the PVC couplings, and quickly wrapped some foam around the shins. One of the buckets full of outdated SPUD parts lingers nearby :D
>>31625 oh, interesting legs!
>>31625 As with our combined suggestion to SophieDev, I suggest you save these artifacts for the Robowaifu Museum(s) sure to come. Same to every productive Anon here, IMO. Thanks for the update, Anon! Cheers. :^)
>>31628 Unlike my other projects, SPUD's old parts don't take up a lot of room. Got a pic of a full body shot and a video of SPUD looking around. Eventually I have to do something about her giraffe neck, but in the meantime I'm working on a new joint for the neck piston that will hopefully eliminate some of the slop by using a universal joint instead of a balljoint. I stole it from inmoov, but the thread for the screw only goes a few centimeters into the piston body so I have lots of room to make it shorter. Also it appears the Pi will randomly cut power during operation (it is running an image recognition AI and compositing images at the same time, after all) so I'll have to run that USB power from elsewhere.
Open file (321.91 KB 1532x1352 spud's arm boo boo.jpg)
Open file (205.30 KB 805x1069 backula.jpg)
Open file (177.37 KB 813x603 toucha spuds spaghet.jpg)
oops, broke the servo horn mount by running one of the servos too fast. That's what I get for only 10% infill. At least I got the servo pulse sequence figured out, so all I have to do is link it up to voice recognition. But now I get to reprint the shoulder piece in a lovely orange color at 50% infill :3 I also rigged everything to run off a single AC powerstrip: usb adapter for the pi, usb adapter for the oled, usb adapter for the servos and a 20v adapter for the chonky motors. It's a spaghetti mess but it runs. Some foam scapulas bring the back together. However there are some balancing issues, so SPUD's arms have to be turned forward - scapulae apart - or she will fall back. A good thing to add to a dynamic balancing script.
You aren't going to make a plate for kneecaps? Looks weird without something that simple.
>>31636 New shoulder joint first ;) Being 50% infill it's going to take a while. Kneecaps will be a variation of the elbow pads.
Open file (191.32 KB 793x1073 tickle spuds kneee.jpg)
>>31636 There we go: kneepad.
Open file (2.41 MB 480x854 SPUD waves hello.mp4)
Aaaaand phase 1 is complete.
>>31647 A cute! Great work, Mechnomancer. So, is dear SPUD soon to begin a career as a 'robo booth babe', or what? :D This has been great fun to watch you develop this robowaifu, and an encouragement to all the Anons here. Thanks for providing lots of details along the way. Cheers, Anon. :^)
>>31648 She - along with some other projects - is going to be on display at an upcoming local convention. So this will be her maiden voyage as a booth babe (last year doesn't count cuz she's more or less the ship of theseus). Once that is over I'm going to be re-printing some parts at a higher infill and re-configuring the animation scripts (movement in the video was hard-coded).
>>31642 That's a bit better, though I'm not sure why you went for a flat plate design and didn't just get some used sports knee and elbow pads to use. >>31647 Cute, but she sounds in bad need of some lubrication. So squeaky. Also, is SPUD left handed?
Open file (1.04 MB 1731x2095 sukabu knees 2.jpg)
Open file (612.24 KB 1024x1583 sukabu knees.jpg)
>>31693 It's a robowaifu, not a rollerball chassis, so no sports equipment. Knees/elbows inspired by some sukabu artwork. Motor noise and squeakiness are to be expected from cheap motors, and less noticeable IRL.
>>31734 I just meant it might look better with a subtle curve like the image on the left. Up to you though. Ah yeah, true, sometimes microphones pick up things we don't really notice with our ears in person.
>>31647 Great. Good work. Have you considered uploading these shorts to a Youtube channel? David Browne got 250k views on one of his videos about Hannah recently. It's not so much about money - especially since "shorts" are worthless and can even harm a channel - but it would create more attention. > Stumbled here via youtube rabbit hole This is where I make the most advertisement. I drop alogs/robowaifu from time to time, which works better for copy and search. I also have an avatar with the term /robowaifu/ in it; some people might recognize this as something referring to an image board. Anyways, I'm not doing this all the time, since "self advertisement" is sometimes not welcome and might at least be seen as rude.
>>31747 Funny enough, youtube will simply refuse to show many of my shorts: youtube will usually try to show shorts to 1000 people in the first 24 hours to gauge interest, but anything robowaifu related won't get shown. (Although a few of my other shorts on another project did get nearly 10k views). If I were to guess, David gets the views because of how realistic the face looks in thumbnails. Now that I've got SPUD more or less functioning in a rudimentary fashion, the aesthetics are something I'm going to really start focusing on.
>>31760 It might also be the tags? I have a suspicion that David started to use some tag like "Alita" and maybe "animatronics". Many people commenting seem to know about these things.
>>31762 After analyzing the metadata, all his vids seem to use the tags "video, sharing, camera phone, video phone, free, upload". Would have to see analytics to know more. Not trying to diminish his work, but the face movements are kinda rather corpse-like. A-working-on-da-face I go!
>>31764 >...After analyzing the metadata Oh, my bad. Instead of speculating, I should just have looked. I don't see him using any tags. That said, GWAStech2 also has a channel with robots, and seems to get ~350 views, sometimes 800. Though a recent one uses a currently catchy song which makes it more pleasant to watch (don't look up the other videos using the same song, unless you are mentally hardened by Lovecraft). >... but the face movements are kinda rather corpse-like. I didn't really see that. Me and many others seem not to care. It's also about seeing the potential of where this is going.
>>31765 Yeah, I get that. I suppose I should've clarified: the expressions (or lack thereof) seemed rather corpse-like to me. Then again, I am very good at reading facial expressions. He got a facial expression to not look uncanny-valley; we'll have to see if it can continue :) Just goes to show that aesthetics is everything: so far SPUD's face has been a half-hearted obligation for me as I've been focusing on the hardest parts first: body design and code. However my printers need some new parts (bowden tube melted into the extruder) so I won't be doing any work until the new hotends arrive :( Video tags are not *technically* public, but easy to find if you know what you're doing.
Open file (3.83 MB 320x497 Wasuarts face.gif)
I know wasu arts was mentioned somewhere, so I took a stab at making a 2d animated face with eyes moving on 2 axes and blinking eyelids. I'll automate this and the uncanny valley will be safely avoided. With better art, of course; the face is a quick derpy scribble to check out how it will work.
>>31784 Interesting. Though, unlike a display, the servos moving the elements would make noises. On the other hand, it would be an option to cut down on costs.
>>31784 That's impressive. Something so simple, looking that good.
>>31784 this is actually very good. I definitely think leaning into more cartoony/anime styles is the best way to avoid uncanny valley (unless you are not personally satisfied with those styles)
Open file (3.98 MB 320x576 kami-chan face gif.gif)
Open file (3.40 MB 480x864 SPUD servonoise.mp4)
>>31827 A robot without motor noises is like a Reese's cup without peanut butter :3 >>31863 You aint seen nuffin' yet :D >>31888 I think stylization to avoid the uncanny valley is good too, and with this update I am very satisfied. Generated an anime face using Stable diffusion and did a few edits to make it useful, printed it out and pasted it to a 3d printed x/y axis with an additional axis for the eyelid (yet to be powered), then put it on SPUD's old neck: since the head will be much lighter the servos will be under less stress. Probably a good view of the face is 135 degrees or so, any more extreme views can probably be obscured by hair. A gif of the movement, then a video file so you can hear how noisy it is. Next step is to motorize the mouth. Might put a small LCD behind for other expressions besides a mouthflap and eventually slap even tinier LCDs as pupils.
>>31660 Sounds good! Looking forward to a good event report. :^) >>31784 >>31891 Very cool. Papercraft brings a lot to the table for all of us during the prototyping phase. As was pointed out, it's a reasonable approach to ultra-low-cost robowaifu designs as well (>>11446) . I kind of see it as sort of a crossover between papercraft Karakuri and Visual Waifus (cf; >>31835, >>14180, >>240).
Open file (290.58 KB 808x1073 SPUD's new head.jpg)
Did a head transplant, now just gotta clean up the design a bit: probably just add the trapezius panels back on. Tempted to just make an entire separate face to work out the eyelid/mouth servo integration so I don't mess up what I got here.
>>32033 A cute! If you can afford to do so, I think it's smart to sort of do 'leapfrogging' with revs of your subsystems (like her head). That way, she'll always be in a ready-or-near-ready state. Cheers, Mechnomancer. :^)
Open file (76.37 KB 674x541 face mechanism.jpg)
>>32043 >leap-pepe revs of subsystems I just might do that: The mechanism itself is relatively simple, and attaches to the skull baseplate (yellow) via 2 machine screws. The most expensive part is assembly time: cutting out the eyelashes and gluing the various paper bits onto the 3d printed structure with accuracy. And working with those itty bitty screws on the sg-90s. Still have no clue how to add eyebrows on the dang thing lol. I suppose I could capitalize on the premiering anime by making a face based on Mina from "My Wife Has No Emotion", although that means I'd have to suffer the dorky MC in order to speak with authority about it.
>>32053 >Still have no clue how to add eyebrows on the dang thing lol. Possibly you might consider the simplistic approach used for the SEER head (cf. >>15287, >>31917) as a way to just get started with it? Good luck Anon! Cheers. :^)
Open file (1.39 MB 2537x3492 LuluMightyBrow.jpg)
>>32053 Magnets are always the answer.
Open file (212.76 KB 339x548 eyelid panel shape.png)
>>32054 It's not how to make the eyebrows, it's how to add them: the eyelid panel is literally in the way of putting any servos in there without some sort of linkage. :D
>>32056 Kek. MAGNETS -- HOW DO THEY WORK!!? :D >>32058 Maybe some sort of vertical linear grooves cut out of the eyelid panels, that would allow the eyebrow control shafts to pass through (and the panels would then slide past)? IIRC, the Nandroid designs seem to have similar grooves on their brows (one above each eye; cf. pics-related : >>31769, >>31895) ? I think you could even make the groove cutouts less conspicuous by stationary backing panels printed in the same color as SPUD's face? >=== -minor edit -add crosslinks
Edited last time by Chobitsu on 07/08/2024 (Mon) 00:19:46.
>>32059 I could use some thin linkages to get motion between the eyelids and the face surface, or -I'd have to do a check of the tolerances- just slap the servos between the eyelids & face surface and have the eyelid bend backwards into the head. Got some screenshots/references for Mina. Might make the entire head a simple papercraft ("it's just like in my Japanese animes" lol) Also printed a 2nd face mechanism with all 4 servos (x/y eyes, eyelid & mouthflaps)
>>32130 OK it'll be interesting to see how you solve it all, Mechnomancer. Good luck. >"It's just like in my Chinese cartoons!" Haha. Weeaboo! :DD Mina may be a great choice r/n, yeah. You might be able to gauge any of your booth visitors' power levels if they recognize her anytime soon, heh. Cheers, Anon. :^)
Open file (269.49 KB 1813x701 PAPERCRAFT MINA FACE V1.jpg)
>>32132 >weeb recognition Ye, gotta try to capitalize on the algorithm. Got 10 weeks until the show ends so I got time :D Mina face is ready for printing
>>32059 >MAGNETS -- HOW DO THEY WORK!!? HAHA I wonder if that reference is from ICP?
>>32154 Heh, it's Miracles all the way down, Bro! :DD In all seriousness though, as a spiritual & scientific man, I see everything here as a majestic & great symphony of miracles, all orchestrated by the one true and living God; the creator of the heavens and the earth. This of course includes the 4 fundamental forces in physics! :^) >t. just_nerd_things_anon.mp4 >=== -sp, minor edit
Edited last time by Chobitsu on 07/10/2024 (Wed) 20:11:49.
>>32133 Neat! Are you going to do her neck too, Mechnomancer? Mina's neck is kind of a trademark for her, right?
Open file (28.29 KB 449x525 Mina_neckcollar.jpg)
>>32156 I suppose so, after watching the first episode Mina doesn't seem to have any mouth movement, so that makes things easier. A nice papercraft collar would be nice to cover up the new, bright orange neck anchor I installed on SPUD. :) I just quickly stitched together some screenshots to get a rough idea for the collar graphic with the dimensions approximating how it would fit over spud's orange neck anchor. tbh it is probably good enough... idk.
>>32158 >tbh it is probably good enough... idk. FWIW, yeah I think so too. Looking forward to SPUD's newest head reskin! Cheers. :^) >=== -minor edit
Edited last time by Chobitsu on 07/11/2024 (Thu) 13:26:11.
Open file (157.13 KB 790x1024 456544777.webm)
>>32164 Lol. Big Girl.
Open file (1.22 MB 1168x1672 ROBOWAIFU BANE.png)
>>32164 >>32170 "You think the owo is your ally? But I was born into it, shaped by it..."
>>32170 UUUU
>>32172 >>32180 >The kawaii catgrils betray you, because they belong to me. I will show you where I have made my robowaifu home, whilst preparing to bring moe justice. Then, I will UWU you.
Open file (147.17 KB 453x989 standing.jpg)
SPUD is currently standing unpowered in the garage workshop. Been working on other, more critical projects.
>>32360 Really cool that you've basically solved the 'unpowered static rest' armature pose issues, Mechnomancer. Hope that your other ventures go well for you. Please share any of your robowaifu-adjacent project work here with us, Anon! Cheers. :^)
Open file (2.04 MB 1591x1063 mina face wip.png)
>>32367 >robowaifu adjacent project work It's boring technical stuff, like re-configuring the mek (which has 1350lb leg motors controlled just like a standard servo, which I recently finished re-calibrating) to have 4 wheel/tread drive and gyroscopic stabilization for the treads (so they don't get pulled off), which involves figuring out how to get an arduino to power a 300lb linear actuator. This could have applications for future heavy-duty robowaifus (a robowaifu capable of carrying 100lbs per arm would be useful), but so far falls under the "Terminators-R-Us", which really isn't what this board is about. But the practice with that does help streamline the robowaifu code. >unpowered static rest It takes a bit to get her positioned in a stable pose, but once she is it takes a bit of a shove to destabilize her. It will take more than a slight breeze to knock her over. I'm going to be working on a tilting work table/stand (a la cybermen) tho so I can easily wheel her around and work on the various bits. Just about everything she requires is physically present, I just need the time to work on it. I've started work on the Mina-san face (pic related). Looks a little wonky from a side angle and might need to adjust some color settings. Thankfully her mouth doesn't move so that makes things a little easier. If I wanted to I could have a display of several different waifu faces emoting using a simple arduino script, but that is something for future me to worry about.
>>32371 >This could have applications for future heavy-duty robowaifus (a robowaifu capable of carrying 100lbs per arm would be useful) Actually, this falls within the design spec headcanon I maintain for our shared group project MaidCom : (>>29219) . For my own research work for a future low-cost Model A robowaifu project (TBD), in my ultra-low-cost materials casting-about, I've already devised some example ~1ft struts that literally cost less than a quarter US in materials, come in at just a couple ounces weight, and can handily deal with 100lbs+ of compressive weight. So yeah. >but so far falls under the "Terminators-R-Us", which really isn't what this board is about. While true, I'm actually thinking about the medical-environment/patient-care aspects. These types of service humanoids will likely need to be able to lift 400lb patients (the filthy slobs!! :D) with delicate precision. So again, yeah. Noice work on the Mina reskin task, Mechnomancer! Good luck with your other endeavors -- please be careful. Cheers. :^)
>>32371 For controlling high power loads like heavy duty actuators, please use mosfets or relays depending on duty cycle needs.
Open file (1.12 MB 4208x3120 servoboards.jpg)
Open file (4.49 MB 320x570 Mek Walk cycle.mp4)
>>32376 I have a 2 stage amplifier circuit (see pic) that takes motor signals from standard servo boards and amplifies them to the large actuators via an L298N motor driver and relays (the L298 motor board only goes up to 24v, and a maximum of 60V will be pumped into the motors). The motor input runs from 24-48v (with 60v if on the power umbilical and fully charged). Naturally, this current setup is far too large to put into a robowaifu unless I was making some sort of monster mecha musume. Above 24V the swing motors will overshoot the position and oscillate, so I mitigated this by having a motor controller throttle attached to the 24/48V input that is controlled by the computer. Eventually I plan to have this automated via hardware (modifying servo testing boards). The video is me controlling these heavy duty motors on one leg via a standard servo testing board. Fuses are installed everywhere. >>32372 >Safety Of course I'm safe!
Open file (113.08 KB 1095x730 constanze.jpg)
>>32378 >PWM into Servo Controller, into L298N, into Relays Are you an Ork? This screams dakka for the sake of MOAR DAKKA. It would be easier and more efficient to directly control the relays. Why not use an Arduino relay shield? I do appreciate you connecting all grounds. People underestimate how essential that is towards long term stability. Also, DIN rails are based.
>>32388 >This screams dakka for the sake of MOAR DAKKA. Topkek. As Mechnomancer clearly understands, every problem resisting his Mek developments shall fall by teh mighty Sword of a Thousand Cuts! :D
Open file (1.70 MB 222x275 orkarmsmoll.gif)
Open file (6.51 MB 360x320 mek guitar madmax.gif)
>>32388 >Are you an Ork? Well... yes (pic related: my powerarmor arms). My pronouns are dakka/dakka lmao You see, the servo motor boards are only rated for 6.5V, and the relays require at minimum 10v to trigger (it's nigh impossible to find relays that trigger at 5-6V and can handle over 100 watts). More than 6.5V will fry the servo driver ICs (I tried), so I need the intermediate step of the L298N. I could use an arduino, but for this project I'm controlling 20 motors, and an arduino relay shield is only rated for 30VDC (I use 48-60VDC). Making an abomination that, in the end, can easily be controlled is ideal. Nobody said building a walking crab mech (and controlling it with a guitar hero controller) would be easy: indeed she is a moody fickle beastie, but Spud would look adorable sitting on her shoulder.
Open file (1.78 MB 1013x841 SPUD on mek shoop.png)
Ok, I couldn't resist a little shooping...
>>32394 unironically my ideal waifu, just needs treads
>>32391 >>32394 >but Spud would look adorable sitting on her shoulder. LOL. You are one of a kind, Mechnomancer. Cheers. :^)
Open file (7.50 MB 240x240 mektreadgif.gif)
Open file (4.23 MB 480x854 Mech Poweer.mp4)
>>32395 She do have treads: the non-robowaifu project is getting 4 treads on her instead of 2 (see gif), and gyroscopically stabilizing them (using 1 arduino for each set). If the treads are not perpendicular to the ground they get pulled off their runners. The casters are a ball-ache to maneuver the mech with, and require "landing gear" to convert from drive to walk. With 4 treads the mek can just use those as feet to walk, then to drive just position them properly and lift the middle legs and off she go. The cockpit has a cheeky little feature when the power umbilical is unplugged(see video). >>32397 >You are one of a kind, Mechnomancer. I never went to school for engineering so I'm too stupid to know what is impossible... or what "proper" engineering is, for that matter ^_^
Open file (177.79 KB 1500x1500 SolidStateRelays.jpg)
>>32389 >Sword of a Thousand Cuts! As difficult as it has been developing a 1.5 meter bot, I must imagine those swords cut deeper with scale. >>32391 >High activation voltage I'm used to Solid State Relays which usually work from around 3V. They're just MOSFETs and driving/isolation circuitry in a trench coat. Really convenient. >>32394 Gork and Mork blessed this post!
Open file (126.34 KB 764x500 8yervw.jpg)
>>32400 >scale Oh that is a whole can of worms. Let's say you, a human, have an arm 84cm long. You rotate it 22.5 degrees, describing an arc of 33cm. If you extend the arm's length to 168cm (twice the original), you only need to rotate the new, longer arm half the degrees to cover the same distance. So even tho the robot covers the same distance with its 2x larger arms, it appears to be moving 2x slower (even if at the same rotational speed as a human). So for your 2x big robot to look like it moves at the speed of a human it actually has to cover 66cm in the same time a human covers 33cm. Which means for something like a gundam to move like it does in the show it has to move at least 18x faster than a person: you step 33cm, it has to step 6 meters in the same time! You jump a meter, it has to jump 18! To say nothing of Evangelions and Jaegers (and bears, oh my)! Leverage is something people woefully underestimate, even the Hacksmith with their failure of a spider mech (the idea of which they stole from me off reddit: I brainstormed the idea on a robotics subreddit a few years back, then lo and behold that's when they claim to have started the project). Compared to working with the mech (600lbs+ of pure "nope :3" energy), SPUD has been easy. :D There's a reason why more folks don't build giant robots: its hard, and Megabots pissed in the cheerios. >relays I would need 4x of those single pole, single throw solid state relays per motor. Starting at $10 each that would be at least $40 per motor. Meanwhile the 2x dpdt automotive relays and an L298N costs about $7.20 per motor. I'm a cheap sunuvabitch. At least with something the size of SPUD I wouldn't need to use 40+ volts so it would just be the servoboards and an L298N.
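To make that arc-length arithmetic concrete, here's a minimal sketch in plain Python using the numbers from the post above (the 18x mecha scale factor is just the illustrative figure already given, not a measured spec):
[code]
import math

def arc_length(arm_length_cm, angle_deg):
    """Distance the hand travels when the shoulder rotates by angle_deg."""
    return arm_length_cm * math.radians(angle_deg)

human_arm = 84             # cm
robot_arm = 2 * human_arm  # 168 cm, a 2x-scale robot

swing = arc_length(human_arm, 22.5)      # ~33 cm per swing
big_swing = arc_length(robot_arm, 22.5)  # same 22.5 degrees covers ~66 cm
# To *look* human-fast, the big arm must cover its 66 cm in the same time
# the human covers 33 cm, i.e. the joint has to sweep twice as fast.
print(f"human swing: {swing:.1f} cm, 2x robot swing: {big_swing:.1f} cm")

# Generalizing: linear speed scales with the size factor to keep the same
# apparent tempo. An 18x-scale mecha stepping where you step 33 cm must
# cover about 6 m in the same stride time.
scale = 18
print(f"{scale}x mecha step: {scale * 0.33:.2f} m per human 0.33 m step")
[/code]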
>>32394 This picture reminded me vaguely of the end of "Knights of Badassdom". Though, he didn't have a guitar but a microphone and was on a truck.
Open file (69.99 KB 953x804 multiplexer.jpg)
Open file (103.88 KB 600x407 gate of gyroscope.jpg)
I have learned of the existence of a "multiplexer" for I2C devices. I2C devices require only 4 wires: 2 for power and 2 for data. However they also require unique addresses to access them, which means unless there is something onboard your sensor to change the address you're stuck only using 1 sensor per microcontroller (or whatever device you're connecting it to). Enter the multiplexer: it can set up 8 devices per board with different addresses, and up to 8 boards can be connected if you solder the address jumpers correctly up to 64 I2C devices! Imagine all the gyroscopic sensors a robowaifu could have! Or fun little lcd text displayboards!
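A minimal sketch of how one of these muxes gets driven from a Raspberry Pi, assuming a common TCA9548A-style board at its default address 0x70 (the part number, the 0x68 sensor address, and the 0x75 "example register" are my assumptions for illustration, not taken from the post):
[code]
from smbus2 import SMBus  # pip install smbus2

MUX_ADDR = 0x70      # TCA9548A default; A0-A2 jumpers move it up to 0x77
SENSOR_ADDR = 0x68   # e.g. an IMU/gyro that only ever answers on 0x68

def select_channel(bus, channel):
    """Point the mux at one of its 8 downstream branches (0-7)."""
    bus.write_byte(MUX_ADDR, 1 << channel)

with SMBus(1) as bus:            # I2C bus 1 on a Raspberry Pi
    for ch in range(8):          # talk to 8 identical sensors in turn
        select_channel(bus, ch)
        value = bus.read_byte_data(SENSOR_ADDR, 0x75)  # example register
        print(f"channel {ch}: sensor replied 0x{value:02x}")
[/code]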
>>32583 Neat. As before with the hands, link? I2C seems to hold lots of promise for robowaifu devs & designers. The simple fact they can be daisy-chained out to the extremities seems to be a big plus to my thinking even if the data rates aren't the best. Also, thin, low-mass wiring for connections is also a big win.
>>32402 >I would need 4x of those single pole, single throw solid state relays per motor. Starting at $10 each that would be at least $40 per motor. I suspect you will have to use MOSFETs driven by a MOSFET driver. You can get these for around $0.50 USD each or less. I see IRF740 MOSFET Transistors IRF740N 10A 400V N-Channel Power MOSFET TO-220 (Pack of 10 Pcs) $7.99 I even see 50Pcs IRF740PBF IRF740 MOSFET N-CH 400V 10A TO-220 NEW T38 for $11.48 The large majority of transistors you need to drive muscles/motors can be these cheap devices. For major muscles you might need something with a little more amperage and more expense, but even those are not super expensive. And you can parallel these cheaper ones. I believe this is what Musk does in Tesla cars. If you look here >>12172 I found some devices on ebay and did some cost accounting. I noticed in the next comment you talked about I2C interconnects. These are good, but a far better connection would be CAN bus. This is what is used in automobiles, medical equipment, factories, etc. and it's ubiquitous. The good news is that in the above link I talk about ESP32 micro-controllers. These can be used and have the software built in for I2C and CAN bus. The idea being that the micro-controllers are so cheap you could use them to drive the network AND do computing for your robowaifu all in one. Caveat: you need drivers to drive the MOSFETs. They will not drive properly with just the power from micro-controller pins, or that's my understanding. Not enough voltage swing to fully drive the MOSFET.
Open file (126.12 KB 1024x576 I2CnioiHetairoi.jpg)
>>32583 >I2C Based and practical pilled.
>>32590 >Mosfets Tried 'em on a breadboard, couldn't get 'em to work without frying 'em (from what I could tell, the circuits didn't take into account reversing voltage). Will probably stick with the ASMC-04B, and -if needed- modify them to run my favorite linear actuators. Also can buy just the servo board. >Cancan bus So far, I haven't found any device that would make it necessary. (see >>32593) >>32585 >Neat. As before with the hands, link? Sauce: https://www.amazon.com/gp/product/B0B2VSP488 >>32593 I just realized if each multiplexer can have 8x PCA9685, and I can connect 8 multiplexers, I could have 64 servo controllers for a total of 1024 PWM channels (not that I'd ever need that many). I use the SPI bus for controlling smol full-color LCDs, which can be a pain because some LCDs don't have good libraries for Python.
BTW, while we're on the topic of these types of busses in general, I'd like to point out THE CAR HACKER’S HANDBOOK book again. [1] I think it's just as pertinent today as it was when I first linked it on the board years ago : (>>772) (since these bus protocols seem to change little.) Cheers. :^) --- 1. https://archive.org/details/car-hackers-handbook-the-craig-smith
Open file (1.67 MB 1682x1080 I2CVSCAN.png)
>>32597 Com bus fight, ready, GO!
Open file (4.38 MB 480x864 Robowaifu Holder.mp4)
The stinky pvc glue finally cured and now I have a proper robowaifu holder. Converts from horizontal to vertical so I can work on various bits.
>>32595 >Tried 'em on a breadboard, couldn't get 'em to work without frying 'em You have to use a resistor-diode combination. Yes it does complicate things, but $52.99 for an ASMC-04B? Too much. Way too much. I may never get there, but my plan is to have something that has the same number of muscles as a human: roughly 300 needed. I'm not in any way against I2C but... CANbus is so much more. It's WAY MORE noise resistant; it's designed for noise resistance. You may design an I2C-linked waifu and have it suffer super intermittent gremlins causing all sorts of problems. However, you may be right and I2C is good enough -- and "good enough" beats the best. >I2C devices require only 4 wires: 2 for power and 2 for data. However they also require unique addresses to access them, which means unless there is something onboard your sensor to change the address you're stuck only using 1 sensor per microcontroller (or whatever device you're connecting it to). I admit confusion. Myself, I'd use the I2C or CANbus to communicate between the main processor and the many micro-controllers, and between each micro-controller. The micro-controller outputs drive the actuators. So each "microprocessor-controller" device has only one address, but that doesn't mean you can't send data to that device that will then be separated out onto many other device outputs. Some ESP32 micro-controllers have 11 output pins; that could control 11 muscles/actuators. Or that's the way I understand it. The I2C is just a bus; the data may vary for each device. And looking around I found this: CANbus LED lights. Now I do not know ALL about these, but I'm assuming they hook up to two-wire CANbus and then you have control of them. I guess you would have to program an address into them. I'm not saying this is optimal, but since they drive 12V 2.65W I expect you could detach the LEDs and then use them to drive MOSFETs. AND the price, Pack of 10 - $8.48, is cheap. This also means if you can get them with LEDs, then I suppose you could, somewhere, find these to drive power MOSFETs around the same price or cheaper. At amazon: DAMA-Wedge-Light-CANbus-Chipsets
>>32597 >car-hackers-handbook-the-craig-smith Nice!
>>32599 >Com bus fight, ready, GO! :)
Open file (58.98 KB 266x697 robowaifu knees.png)
>>32603 >You have to use a resistor diode combination. I figured as much, but the purportedly "working circuit" never stated that I needed those parts. Meanwhile on ebay you can get just the control pcb for the ASMC-04B for $15 (just search for Single Channel Servo Controller Board) :D I don't use this for the mek because the boards are only rated to 20V, and the mek can get up to 60V when its on the power umbilical. Probably could be used for robowaifus; got an idea for a knee joint from the attached image. >I2C noise Only noise problems I've had is the servo signal wires themselves picking up motor noise (but that's because they were close to a line pulling 50+ watts). I2C I've had no problems with, but it isn't meant for high-density communications. Basic bits (like sending numbers back and forth) are fine, but higher-density data like an image requires SPI. >I2C addresses Yeah, probably could dig into I2C source code to get it to use different i/o pins but it sounds like a lot of hassle. You technically *can* use multiple I2C devices with the same address on the same i/o pins if they only receive information (like servoboards), but they would mirror each other. There would probably be some confusion if you're using multiple devices that provide feedback on the same address: one gyro reads 10 degrees, another reads 50, what would the computer do? Split the difference or -more likely- throw an error. Thankfully, there are I2C servo controller pcbs (PCA9685) that not only allow control of 16 servos per device, but they have jumpers that can change the address to connect up to 8 servo controllers (for a total of 128 servos, or "muscles") without needing a multiplexer! Spud uses 2 PCA9685s, with one of the jumpers soldered to change the address.
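For anyone wanting to copy that setup, driving servos on two of those boards at different addresses is only a few lines with Adafruit's ServoKit library. A sketch under assumptions: the 0x41 jumpered address and the channel numbers below are examples, not SPUD's actual wiring.
[code]
from adafruit_servokit import ServoKit  # pip install adafruit-circuitpython-servokit

# Two PCA9685 boards on the same I2C bus: one at the default 0x40,
# one with an address jumper soldered to give 0x41.
torso = ServoKit(channels=16)                # default address 0x40
arms = ServoKit(channels=16, address=0x41)   # jumpered board

torso.servo[0].set_pulse_width_range(500, 2500)  # widen range if the servo allows it
torso.servo[0].angle = 90   # center a hip servo on the first board
arms.servo[3].angle = 45    # an elbow on the second board
[/code]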
>>32611 >I2C data collisions from gyros This is frankly unlikely to happen. Everything should be coordinated by the master clock. There are inherent collision detection and resolution standards in I2C for when something goes wrong. The I2C site is a great read if you're having some I2C implementation issues or worries. https://www.i2c-bus.org/
>>32613 >I2c has error handling. Oh that's good, now to daydream about implementing SPUD's kneejoint...
>>32599 Lol >>32600 Very nice, Mechnomancer. SophieDev created a stand for his robowaifu work as well. I personally think Anon's Robowaifu Creation Lair should always include a standard-issue robowaifu worktable to mek things simple to manage! :^) > insert: Building Robowaifus for Dummies book cover image >>32613 >https://www.i2c-bus.org/ Thanks Kiwi!
>>32611 >Yeah, probably could dig into i2c source code to get it to use different i/o pins Maybe I misunderstand I2C, or maybe you misunderstand, but my understanding is that it's just a spec to send data. Not necessarily a way to control individual devices, though it can. Hard to put into words, but the basic idea is yes, the spec can turn things on and off and other stuff, but it's not really made for real-time control of active electronics (muscles). I think it would clog up bandwidth-wise unless you separated it out and had many lines, but maybe that's what you want to do. Like HTTP is not necessarily a web page, just the way pages are sent and a spec to represent one. Not that it couldn't likely be used that way. It's likely that what is needed is some sort of microcontroller. When you can get 11 outputs or inputs in a $7.00 or cheaper device (ESP32, my cost-effective favorite) and talk to it with I2C, it would seem WAY MORE cost effective to do that. To get any sort of waifu that looks human-lifelike you're going to need a huge amount of inputs and outputs, like 300 or more. All of these will need some computing, so why not use the microcontroller to do this AND control the outputs? PCA9675: I think you do not understand the spec sheet on this. Look, it says, "...higher total package sink capacity (400 mA versus 100 mA) that supports having all 25 mA LEDs on at the same time and more device addresses (64 versus 8) are available to allow many more devices on the bus without address conflicts..." "...current - Output Source/Sink 100µA, 25mA..." It will not drive a servo by itself. It needs a MOSFET or transistor to do so. It "might" be good for driving MOSFETs. Not sure. Apparently N-channel MOSFETs have less resistance when turned all the way on (VERY GOOD) BUT need a gate voltage well above logic level to get the total drive-on effect. Supposedly there are driver chips that do this. I have not totally got this handled yet, on what is the best, cheapest way to do it. The above (PCA9675) appears to me to be very much in the same boat as the actual microcontroller, in that you need extra circuitry to drive transistors, so why not just stick with the microcontroller in the first place? This does seem to be very good at driving small loads like LEDs, but to work a muscle you're going to need a power transistor and, best as I can presently tell (could be wrong), MOSFETs are the lowest cost, least heat-producing method. It would be great if someone would make a $0.50 MOSFET chip that directly hooks to a microcontroller, has low impedance when on and needs no other devices. There may be such a thing. Let me know if you see one, but for the ones I have seen the price is really high. Far higher than putting parts together to do the same thing. It's a real problem. I see where you could easily get to $800 USD just on microcontrollers, driver chips and MOSFETs. Not counting the motors or any main logic processor. I did a quick search and found a TMC4671-LA SERVO CONTROLLER IC, $9.44 each, and it only drives 4mA. Useless for most anything we need, cost- and force-wise. One off-the-wall way I've been thinking about is making magnetic amplifiers (low cost), but I'm not sure how to get the amount of power out of the magnetic amplifier and -- I think, not sure -- it would use a lot of power. With magnetic amplifiers it might very well be that the chip you showed could be used to great cost-saving effect.
In the end magnetic amplifiers, even if they wasted power, might be so much cheaper -- and rugged, don't forget rugged -- that they would be the way to go. Because of all this cost for inputs, outputs, etc. I've even thought of hydraulics (dread the thought of that, I hate them), but the cost must be cheap or you'll never reach average-guy-level usage. It will all be globalhomo running factories with their $25,000 Teslabots.
It may be that magnetic amplifiers are the way to go. I do not understand them. I understand saturated flux but I don't understand how this can be done with small currents. Sigh. But what I just read sounds promising, "...Single-stage magnetics can be built with gains of about 200,000, far beyond the capabilities of the vacuum tube. With a gain on this order, a few milliwatts of power in the control winding - an amount that could be supplied by one or two flashlight cells - may control a load of 25,000 watts in the output circuit..." WOW!!!! https://starlightpower.net/files/Magnetic_amplifiers_and_saturable_reactors.pdf I think if this could be figured out you could combine a coil for the muscle movement with a small control signal in one package. Eureka, a cheap waifu.
>>32643 >Magnetic Amplifiers They're interesting esoteric tech which would be good for large scale applications. They work by using a DC signal to induce flux in a coil. This flux resists change, limiting how much AC current can pass through it. Essentially acting as a low-efficiency, slow, high-distortion TRIAC. They still see use, mostly because they're incredibly robust. Often being set up then forgotten because they will just work for decades without issues.
I've looked at magnetic amplifiers for years and have some sort of glitch where I can't understand them. It's obvious they work, but I don't understand. You have some things wrong, I think. There's no reason they can't be small. Some computer circuits were built with them, but transistors were much smaller still. It's not the flux in the coil that resists change. If I understand correctly it's the flux controlling the inductive reactance of a magnetic core material. Transformers can only conduct so much flux before the reactance (resistance) in the core material pegs and it will not conduct more. This is not linear. It conducts all it can, then rapidly chokes off any more, depending on the material. Some materials have a very square curve where it conducts, then is clamped down to no more current. A straight level graph. Now supposedly the DC bias chokes the magnetic material to raise its reactance. Here's what I cannot understand. Let's say you have a 100W mag amp: how does a much smaller DC coil choke off the flux in the magnetic material? Wouldn't the current have to be close to the output current to saturate the core of the transformer? I don't see how a small DC current saturates the core so that a larger AC current is controlled. Maybe??? it builds up in the core??? I'm grasping at straws. I can not understand. It makes no sense that one current creating a flux that is much less than another current's flux can choke off the larger. Also, I do believe that they are fairly efficient. You can make a transistor circuit more efficient, but mag amps are not especially tremendously inefficient. And they are slow, but only compared to transistors. Sure they might have trouble with microwaves, but we don't need that. For moving a solenoid they work fine. They used them to control V-2 rockets in WWII and all other sorts of stuff. The key would be a way to make these by some sort of printing or automated process. You can't get anywhere winding a bunch of tiny coils. Done right, I could see the tiny amperage of a microcontroller output going into one tiny mag amp, boosted a little, then into a larger one with the size depending on the output needed. Then you could run AC through all the muscles and control with these mag amps. I'm going to have to figure this out. I think it may be key. We're talking a little wire and some iron oxide or powdered iron to make this work. While not as good as the specialized stuff, literally iron oxide could be made to work for cores. Super cheap. With the small amount of stuff we would need it might be affordable to use silver paste for leads like they use on solar panels. There are also other types of paste they are now using on solar panels that require less heat. I was just reading about heterojunction solar cells where they need lower-temperature paste for leads. https://en.wikipedia.org/wiki/Heterojunction_solar_cell Let's also not forget: the higher (within reason) you go up in frequency, the smaller the transformers and the more power you get in a circuit. Also, if we can get this to work, heat will not kill it. It's metal and copper wire.
Open file (58.25 KB 1500x1500 servo breakout board.jpg)
Open file (168.58 KB 1456x1415 PCA9685.jpg)
Open file (108.48 KB 1500x991 PCA9685 circuit diagram.jpg)
>>32642 >PCA 9685 is for LEDS The PCA9685 literally comes on a PCB designed for controlling servos. >Not enough power Uh, servos don't work like that. They have 3 pins: positive power, negative power, then the logic pin which takes a pulse from 0.5-2.5ms to position the servo (servos have their own on-board mosfets to drive their motors). You don't need to send power for the servo through the PCA9685, you just need it to send out the PWM pulse. I've driven half a dozen mg995s at once using a PCA9685, but if you ever do pull too much current you can either unsolder a resistor to use your own power or use a servo breakout board (just make sure the ground to your microcontroller matches your new power source; you can see from the circuit diagram that even the circuitboard has its own sort of built-in breakout board with the headers, where the power to the servos DOESN'T go through the PCA9685 chip). The ASMC-04B has its own power input, so current through the PCA9685 is no issue. >I2C be slow, yo. Spud's "animation" communicates info via I2C to the PCA9685, then to multiple servos multiple times a second. In fact, the "frame rate" is slowed down to 1/3 its capacity: the computer waits 3 cycles before sending the next "frame" to the servos. As I mentioned before, the maximum pulse sent to a servo is 2.5 milliseconds, so there's plenty of bandwidth to go around. >realistic humanoid I'm going anime-style, bro. Robowaifus are most prominent in animu so it makes sense to look animu TLDR: servos have their own mosfets and all the PCA9685 does is provide the logic signal / breakout board and animate servos at about 90 fps.
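To put rough numbers on that "plenty of bandwidth" point, here's a quick back-of-envelope sketch in plain Python, assuming the usual 50 Hz servo frame and the PCA9685's 12-bit on-time resolution (the 100 kHz bus figure is the standard I2C base speed, used here only as a conservative assumption):
[code]
FRAME_HZ = 50               # standard servo refresh rate
FRAME_MS = 1000 / FRAME_HZ  # 20 ms per frame
RESOLUTION = 4096           # PCA9685 counts per frame (12-bit)

def pulse_to_counts(pulse_ms):
    """Convert a servo pulse width into the PCA9685's on-time counts."""
    return round(pulse_ms / FRAME_MS * RESOLUTION)

for pulse in (0.5, 1.5, 2.5):
    print(f"{pulse} ms pulse -> {pulse_to_counts(pulse)} counts")
# 0.5 ms -> 102, 1.5 ms -> 307, 2.5 ms -> 512

# Each position update over I2C is only a handful of bytes, so even a
# modest 100 kHz bus can refresh dozens of channels well inside one
# 20 ms servo frame.
[/code]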
>>32647 >You don't need to send power for the servo through the PCA9685, you just need it to send out the PWM pulse oops. My mistake. I thought they were on the board.
>>32647 >PCA 9685 is for LEDS You have me quoted as saying this. I did not say this. I quoted the spec sheet that said you "could" run so many LED's with it and quoted the driving current it provided. I think there's been a misunderstanding. I misunderstood and thought that you said you were going to use just (PCA9675) to drive servos. I did not understand you had a board with it and that the servos you were using had MOSFETs on them. You seem to have figured out what you need and need no help from me.
>>32651 > I did not understand you had a board with it and that the servos you were using had MOSFETs on them. That's servos 101 ;) >You have me quoted as saying this. I did not say this. I quoted the spec sheet that said you "could" run so many LED's with it and quoted the driving current it provided. The PCA9685 was *initially* made to drive LEDs, since pulse width modulation can be used to make an LED have various degrees of brightness by flashing it for fractions of a second. You're not technically wrong, but since servos also run using pulse width modulation, all sorts of companies found it works well for driving servos, too :)
>>32646 They can be small if you are working with lower currents. They are only large because they're inefficient and more metal helps them not overheat. Reactance happens because of flux. >How does a few DC watts affect a 100W AC load? It's actually simple. AC is constantly making alternating fields. DC makes a constant field. This constant field creates a reactance when the AC field meets it. The AC starts at 0 and can only rise so far when imposed upon by the flux of the DC field. Regardless, TRIACs are nearly universally superior in every way possible. If you want to experiment with them, it's easy enough to make your own with a couple of transformers. I will not respond to anything more on the topic because I work with DC equipment. Controlling AC loads isn't something I care about. https://www.youtube.com/watch?v=lB3HBoKPbOQ
>>32661 You keep saying magnetic amplifiers are inefficient. This is NOT true. "...Depending on the application, the magnetic efficiency of the amplifier can be very high, close to 100% in theory.." I perfectly understand the difference between AC and DC. What I don't understand is why DC can build up and saturate cores and AC not. A normal transformer makes an EM field that travels through the iron core to another coil of wire. The losses in the iron core are very low. Super, super low. Transformers are some of the most efficient devices used in all electrical circuits. "If" the magnetic flux traveling through the iron, made with, say, a 120V 100 amp AC line, can go through the iron core with little losses, how can a small DC mA current saturate the iron core? 60Hz is not that fast. Even if it is AC, the instantaneous magnetic flux through the iron core is very high compared to the flux created by this little DC mA current that is supposed to saturate the core. Does anyone not understand why this baffles me? The actual DC flux through the iron core is much smaller than the much higher flux from the larger-power AC. And to add, if the flux is so sensitive to saturation then how the hell do AC transformers work at all? I will figure this out, but presently I do not understand. I do understand inductive reactance in coils. I do not understand what makes DC inductive reactance in cores more powerful than the instantaneous reactance in cores due to AC. After a little looking around I found I'm not the only one that doesn't have a good handle on this. I found a paper that said, "...The effects of flux interaction of magnetic fields become controllable if the permanent magnet is moved towards a current-carrying auxiliary winding, which is wound on or introduced into the magnetic core...Orthogonally aligned magnetic fields and fluxes theoretically do not affect each other, according to the superposition principle. Nevertheless, they can be used to saturate parts of the core, what influences the inductive behavior of the magnetic device, making it adjustable over either its full operation range or parts thereof. At the moment, the potential of the flux interaction under discussion is not yet clearly summarized. Furthermore, the application of the effect is not, or only rudimentarily, described in the literature...." "Review of Flux Interaction of Differently Aligned Magnetic Fields in Inductors and Transformers" I suppose this is one of those things like gyroscopes where they say, "It just does this" and you are supposed to ignore the idea that what you see is odd as can be. My quest in understanding this is solely based on finding the absolute cheapest way to control actuators. I don't care if it's AC, DC, hydraulics, only that it's cheap and works. And BTW, if you control AC with a mag amp and run it through a simple rectifier you get pulsating DC, which works fine in motors.
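For whatever it's worth, the usual textbook hand-wave on that puzzle is turns count: core saturation tracks ampere-turns (magnetomotive force F = N·I), not raw watts, and mag-amp control windings are typically wound with far more turns than the load winding, so a few mA of DC can rival the MMF of the big AC load current. A toy calculation, with completely made-up winding numbers purely for illustration:
[code]
def ampere_turns(turns, current_a):
    """Magnetomotive force F = N * I, the quantity that drives core flux."""
    return turns * current_a

# Hypothetical windings: a few mA through thousands of control turns can
# match the MMF of amps through a handful of load turns.
control = ampere_turns(turns=10_000, current_a=0.010)  # 10 mA DC bias
load = ampere_turns(turns=50, current_a=2.0)           # 2 A AC load

print(f"control winding: {control:.0f} ampere-turns")  # 100 A-turns
print(f"load winding:    {load:.0f} ampere-turns")     # 100 A-turns
[/code]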
>>32672 >I suppose this is one of those things like gyroscopes where they say,"It just does this" and you are supposed to ignore the idea that what you see is odd as can be. Haha. Welcome to modern """science""", Bro! :D >"It's Turtles all the way down, Bro!" :D What's really amazing--and is a strong indicator of the intelligent Creator behind the creation around us--is that the universe and its 'laws' (a set of manmade constructs, btw, that helps us sleep at night :D) are comprehensible at all. If you buy into the philosophy of materialism, then why should they be? >tl;dr Michael Faraday would like a word with you. :DD < insert pic of amazing universe pic, with an amazed kid+telescope pondering it all under a mild, Summer night sky... Cheers. :^)
Open file (577.89 KB 995x1799 new thigh.jpg)
Open file (58.98 KB 266x697 dream of knees.png)
The hip servo horn hub (bit that connects the leg to the hip) ended up breaking during transport (printed with only 10% infill with 2 walls) so while I was re-printing it at 30% infill w/ 4 walls I figured why not slap that actuator on there. Could probably get more rotation with a shorter actuator but for now it will do to help hold SPUD up while she stands on her maintenance rack. Also since the stroke rod rotates, I could slap a servo down in the knee to make the entire leg twist from side to side (even though that really should be happening in the shin with the tibia/fibula). Neat!
>>32745 Very nice, Mechnomancer. Leg kinematics are actually quite complex once you study them in detail. And since the hips/legs/feet complex serves as the base foundation for the humanoid form, it's very important to get them right! GG, Anon. Looking forward to seeing how you approach rotation about the Y axis (I'm going to use the left-handed, common 3D DCC tool coordinate space in my discussions on the board; ie Y is up)--both for each individual leg, and for the trunk as a whole--moving forward. Cheers. :^) >=== -fix my coords, lol -prose edit
Edited last time by Chobitsu on 08/12/2024 (Mon) 00:41:45.
Open file (2.82 MB 1072x1468 double_jointed.png)
>>32746 >Y axis None yet: if femisapien can walk without one, hopefully SPUD can, too. I could probably squeeze one in for torso rotation, though Replaced knee with a double joint. Might play with the actuator replacement to see if I can get more rotation, there is more than enough space inside for an encoder.
>>32787 >None yet: if femisapien can walk without one, hopefully SPUD can, too. Good thinking! Mark Tilden : ( >>10257 ) seems rather a genius to me, tbh. His primary insight in this regard AFAICT is that there is an invisible (and dynamic) center-of-mass-moment during a walk cycle, and you can 'throw' the lifted-leg about this arc to achieve bipedal locomotion. >I could probably squeeze one in for torso rotation, though Great idea. That will allow you to accelerate the 'lifted-leg-throw' motion (effectively lengthening that 3D arc of mass-moment within 3D space situated near her lower pelvis). Looking forward to your progress with Spud, Mechnomancer! Cheers. :^) >=== -fmt edit
Edited last time by Chobitsu on 08/13/2024 (Tue) 02:04:51.
Open file (424.12 KB 2336x1052 new knees.jpg)
New knees installed, and they can support the weight. Eventually I'll probably 3d print some thigh/shin panels. Might try to copy the Unitree robot and squeeze a motor into the shin to add a linkage to tilt the foot, in a similar way to the Achilles tendon.
>>32813 >and they can support the weight Excellent, that's really good news, Anon. >Eventually I'll probably 3d print some thigh/shin panels. I'm interested to hear your evaluation of our approach for lightweight, 'mesh flat' panels for MaidCom : ( >>16525, et al ) ? As always, looking forward to your progress reports, Mechnomancer. Cheers. :^)
Open file (153.41 KB 803x1046 spud's eyebrows.jpg)
>>32824 A 3d printed fabric/web/frame would probably be nice for testing proportions and stuff, but I'm probably gonna go unga bunga solid panels. Figured out the servo linkages for the eyebrows; however, the face got crinkled so I'll have to print out a new one and attach it to some firmer cardstock.
>>32583 Yeah, it's good to keep that in mind. But I vaguely remember that this also can overload some processor and then fail. Then again, you probably know better. I had the idea to ask Llama 3 about it: > That means the RPi can manage roughly 39 I2C devices on a single bus (since each combination contains two GPIO pins). However, keep in mind that practical limits may differ based on specific usage cases, cable lengths, and physical proximity. and also compared DMX512 to I2C, then asked about CAN. This is the short summary: > CAN is great for sending data between devices because it allows multiple devices to talk to each other at once, reducing wiring complexity. It's also very reliable, catching any mistakes that get sent along the way, and resists interference from surrounding electronics. All this adds up to a robust and dependable way to communicate between devices. Personally I will prefer CAN, assuming I'll ever get beyond prototyping. I don't want to deal with unnecessary errors and more cables than necessary. Especially in bending robot joints. >>32600 Impressive. You're really getting things done.
Open file (387.52 KB 1072x1386 spud with wig.jpg)
Open file (296.96 KB 597x1437 spud full body face.jpg)
Spud's new forehead just got off the printer, so I installed it with the wig and she looks pretty good. I'm tempted to make the face held on by magnets so I can swap it easier. Looking at the full-body pic, the shoulders could use some work. All well. >>32846 >39 I2C devices on a single bus I have a feeling that is way more than I'll be using :D
>>32841 >A 3d printed fabric/web/frame would probably be nice for testing proportions and stuff I get your point (and just cardstock paper is great for this as well). But from my perspective, the primary benefit is low-mass. All the shell parts add up the grams quickly! A secondary benefit is the /comfy/ factor. >tl;dr Lightweight, squeezable waifus a best!! :^) >Figured out the servo linkages for the eyebrows Nice. Good luck with all the little details. I hope you make your time goals for your event(s), Mechnomancer. Cheers. :^) >>32847 >A[hh] well. Little-by-little, Anon. Keep moving forward.
>>32847 Looking good! Using a 3D printed form to give her paper face some texture and "depth" is a clever idea. Using some steam would help to smooth it out over the print. You may want to look into printable fabric, this would allow the face to conform better.
>>32672 I asked Llama 8B instruct using your text from >>32646: [Quote] Your understanding of magnetic amplifiers (also known as autotransformers or magnetizing reactors) is spot on! The key concept here is non-linearity and saturation. When an alternating current (AC) flows through a magnetic core, its permeability increases with the applied magnetic field strength until the material reaches saturation point. Beyond this point, additional field doesn't increase the permeability significantly; instead, the material exhibits increasing reluctance (resistance). This behavior creates a nonlinear relationship between the AC input voltage and current, which leads to the characteristic step-like curve or "square" curve as you described. The role of the DC bias coil is indeed counterintuitive but crucial for control: Permeance modification: When a weak DC current passes through the separate winding, it modifies the core's permeance by inducing a residual magnetization (R.M.) within the magnetic structure. Initial saturation compensation: Although the overall effect might seem minimal due to low DC current magnitude, even tiny changes in R.M. contribute to reduced initial core susceptibility during subsequent excitation from AC flow. Magnetic domain alignment: By slightly adjusting the direction of net magnetization via DC offset (in conjunction with Hysteresis loop), domains become aligned along main axis making further induction require greater external B-field value Increased Reactance: Now when higher peak values appear over time during each new half cycle they face harder opposition leading effectively towards maintaining established balance hence better limiting excessive surge energy while also enabling smooth operation under various load conditions without overheating. As for your statement regarding needing similar current levels: While some similarity may exist, consider these factors influencing interaction dynamics: – Flux linkages dependant upon primary winding’s turns & geometry vs secondary counterpart specifics -making exact comparisons difficult – Differences arise across various materials properties especially near saturation regions However given proper construction techniques specifically tailored targeting specific use profiles including optimized insulation designs – such discrepancies would largely fade away once operational nuances considered Here's a simple analogy to help solidify the idea: Imagine running water (flux linkage buildup rate): Without obstacles like rocks (DC bias/residual magnetism): Water easily spreads across the entire streambed With moderate rocks strategically placed alongside: Rate slows temporarily yet still allows continuous flow albeit now more turbulent But when adding large boulders (higher DC offsets approaching or breaching maximum threshold) suddenly obstructing major portions of stream pathway almost nothing passes beyond those areas anymore! It feels weird initially because we're used dealing directly proportional effects whereas our case mixes two intertwined physical responses interacting concurrently producing what appears paradoxical result however both elements play vital roles allowing desired outcome: Magnetic Amplifier stabilizes fluctuating power supplies ensuring efficiency stays relatively consistent despite variations throughout network infrastructure [/Quote] That said, Kiwi might be right, if there's something better and we work with DC then what's the point? 
Also, maybe if this goes on, then it should move over to the Engineering General or something like it, since it does not seem to be related to Spud.
>>32846 Related about buses, sensors and multiplexing in the engineering general thread: >>32900
My apologies for talking off topic in the SPUD thread. It's not purposely malicious. I was only responding to a post in this thread and didn't scroll all the way up to see what the thread was about in the first place. It comes from looking at the landing place link /robowaifu/ - DIY Robot Wives and responding from there. I'll have to be more careful and try to remember to scroll up, but, I expect I'll make the same mistake sometime in the future. I just wanted you to know it not any direct effort to derail anything. Just a mix up on what thread I was on.
>>33007 I will forgive you, however... Anyway, SPUD is on display at a regional fair. Unfortunately her servo power converter exploded so she's been quite stationary. Even so, been lots of fun & positive reactions (have some basic voice recognition with Vosk, eSpeak TTS responses, and object recognition running). Even a couple folks recognizing the chobit-style ears and one cheeky person asking where her power button was ;)
>>33144 >Anyway, SPUD is on display at a regional fair. Great news! Must be gratifying, Mechnomancer. >Unfortunately her servo power converter exploded so she's been quite stationary. Bummer. Maybe that goes on the redundant spare list? >Even a couple folks recognizing the chobit-style ears The world is healing! >and one cheeky person asking where her power button was ;) Lol. What did you say? :D
Open file (298.51 KB 648x606 chobits lel.png)
>>33146 >Great news! Must be gratifying, Mechnomancer. It is. This is my fourth year there (last year they gave me an invitation to come back every year) >Bummer. Maybe that goes on the redundant spare list? I'm not sure but I think the power surge might have burned out the servo boards, I'll have to check once the fair is over. Hopefully the I2C bus didn't get fried. >Lol. What did you say? :D Oh, I replied that I'm not that sadistic.
>>33150 >I'm not sure but I think the power surge might have burned out the servo boards, I'll have to check once the fair is over. Hopefully the I2C bus didn't get fried. Oh man, I hope it's not that bad Mechnomancer. >Oh, I replied that I'm not that sadistic. HAH! Yeah, that was really warped in the first place. Clearly a result of 3DPD writing the story. Since I'm Hideki Motosuwa if I were Hideki, I'd have marched Chii right downstairs the moment I knew that Mrs. Hibiya was Chii's 'Mom'. :D <Fix her! >What? Fix what? <You know what! Just fix her... >Oh, you mean... Yes my husband insisted on that. You see, He was worried that my girls would... <FIX HER!!
>>33155 Well, the good news is the I2C bus isn't fried; one of the neck/arm servos is shorted and hogging all the power. Also sg90 servos can't run for very long against the resistance the eyelid has, so I'll have to go bigger. Probably have to figure out how to replace the other sg90s with bigger servos too, because they're so twitchy. >he was worried my girls would.... Sexual responsibility is a part of growing up. I hate the mental wanking the postmodern pseudo-intellectuals make of it.
>>33187 That's good news at least. I'm sure you'll have SPUD's systems right as rain once she's safe & sound back in your robowaifu lair! :D >Sexual responsibility is a part of growing up. Eheh. I was just pulling slightly fanservicey headcanon out of my *rse. Again, that entire story aspect was warped from the beginning. (Honestly, it's hard to believe something so endearing & enduring came from a gaggle of 3DPD. I wonder how many of them are wine+cat ladies today?) :^) >I hate the mental wanking the postmodern pseudo intellectuals make of it. True enough. Sub-human freaks, the lot of them. Hard to believe humans turn their offspring over into the hands of these demonic wretches in our so-called 'schools' of today.
Open file (121.23 KB 1115x630 creator thoughts.jpg)
>>33198 >Honestly, it's hard to believe something so endearing & enduring came from a gaggle of 3DPD. Don't give them too much credit. The concept of raising one's ideal spouse has existed pretty much since the beginning of Japanese literature: Lady Murasaki in The Tale of Genji. There are similar videogames such as the NGE Ayanami raising project (released during the run of Chobits). Raising games are a subgenre of life simulation genre, but I forget exactly where I heard of it. Perhaps in reference to "The Shinji Ikari Raising Project" manga or mentioned in "Welcome to the NHK" ¯\_(ツ)_/¯
>>33227 Yeah, that's true. Thanks for reminding me, Anon. As our own Greentext anon frequently points out, this drive in men is from time immemorial. >Lady Murasaki in The Tale of Genji. Thanks! Do you have a specific translation you prefer, please? >Raising games are a subgenre of life simulation genre Neat. I've often thought about a kind of 'society of mind' approach for our robowaifu's AIs. Point/Counterpoint -style tensions intentionally built in. Maybe after a fashion of the way AlphaGo was developed by antagonistic 'play the bot against itself'. I anticipate it would be quite difficult to keep the AI from becoming schizo w/o lots of checks & balances wrapping the whole thing in rubber-babby-buggy-bumpers. :^) >"Welcome to the NHK" LOL. Don't even get me started about that animu! :DDD
Open file (284.89 KB 1317x1670 Compound_ASMC04_shoulder.jpg)
Been tired of having to deal with the silliness involved in getting the smol 70kg servos to move the shoulder properly, so I'm working on a compound shoulder using 2 ASMC-04s. Roughly modified an old deltoid panel to get the motor to poke out (will cap with an orange tapered cylinder). Also, after further testing it appears that *several* servos might have been krumped during the power supply failure. I'll have to test each one and see which need replacing (the eyelid & horizontal eye movement servos are on the list). Also noticed that some children can be mean to robots :( . Got a few basic voice commands running (with a paper explaining them) and they expect more. All well, I guess I'll have to put in a few more secret ones. At least the RS Media and Femisapien are a big hit while I'm giving my demonstration (they wander around into the crowd and interrupt me lol)
>>33265 That's looking good, Mechnomancer! Once you've established a solid foundation for SPUD, then of course you can 'put her on a diet', so to speak, and you'll find your actuators suddenly getting """stronger""" ( cf : >>4313 ). But as the old adage goes : >First things first! >Also noticed that some children can be mean to robots :( That sucks. The Globohomo is brainwashing them to be so, of course. While little boys are generally the more rowdy by our natures, I bet we will all discover that it's the female species that are the ones truly hateful & vindictive towards robowaifus (again, as they are being programmed to be). >tl;dr It's going to be a dark time just ahead, but should we win through then the world will have a great healing. < Steel yourself, and once-more into the breach, Anon! :^)
>>33265 >the eyelid & horizontal eye movement servos are on the list So SPUD had a stroke
Open file (107.35 KB 1431x1213 Servo power supply.jpg)
>>33268 > 'put her on a diet' She already only weighs 40lbs D: I suspect the motor's problem may be a lack of sufficient current, but I'm upgrading anyway. Also probably going to get several dedicated servo power supplies (pic related). After checking all the servos: all of the ones in her left arm, plus the neck rotation, eyelid, and eye X movement servos are krumped. Good news is I (probably) have enough servos to replace them. I installed the new arm joint but forgot to take a pic (whoops!), but a few tweaks are needed anyway.
>>33296 >She already only weighs 40lbs D: >"PUT DOWN THAT FORK!! Here's the carrots & the celerys instead!" Won't someone please just think of the Russian gymnasts!? :D <---> Dang! Sounds like it was pretty bad then, Anon. Glad you have a good attitude about it all. Staying lighthearted in the face of adversity is an admirable trait in anyone! I'm sure you'll get it sorted out Mechnomancer, and dear SPUD will be better than ever. :^) FORWARD!
>>33297 > Staying lighthearted in the face of adversity is an admirable trait in anyone! As Christ teaches us, to fall short and miss the mark is inevitable. It is important we recognize our mistakes and don't make them in future. Or a more modern interpretation: it doesn't matter how many times a man falls, it is how many times he gets back up (and doesn't repeat the mistake). Got pics of the shoulder (might rotate the ASMC-04 servo so it is parallel to the spine and so the orange hub doesn't hit the ribcage). Now the exhibition is over so I can start doing the full overhaul and restore spud to her uwu glory.
>>33333 Nice get! Yes, I agree with everything you said here, Mechnomancer. >shoulder, more time... Great! I'm sure she'll be right as rain in no time, Anon! Cheers. :^)
I don't know why people worry about face servos much. A screen for a face is just fine and more aesthetic. To me, making a mechanical face seems like an extremely particular final detail that should come last. The most important elements are a competent AI, self locomotion, arms, and an aesthetic body if you ask me. That being said you're doing well and pretty far along. I would focus on its ability to remain upright and move around, and to interact with the environment with its hands independently, using something like nvidia Optimus training over several epochs. That or get someone with VR to telepresence the thing. Tons of us have 11-point sub-millimeter tracking for VR, from which data can be exported in a literal fraction of a second over the internet. https://youtu.be/dSc27JPm3r8?si=W6euiriOvfTXvTDZ
Open file (4.31 MB 552x480 SPUD facetest.mp4)
>>33340 >The most important elements are a competent ai, self locomotion, arms and aesthetical body if you ask me. You're focusing on the "robo" part and neglecting the "waifu" part :) >A screen for a face is just fine and more aesthetical. Been there, done that, got the T shirt. >>33339 >Great! I'm sure she'll be right as rain in no time One afternoon later and she be lookin fine enuff for a video (altho maybe a little lopsided)
>>33351 Nice! What all have you done to patch her up so far, Mechnomancer?
>>33351 She's really looking cute with that head accessory. I agree with the way you're going about making her face. It's an effective method which balances aesthetics, complexity, and expressiveness greatly.
>>33353 I added a thin cardboard backing to the face to prevent further crinkling, slimmed down the eyebrow linkages (the eyelid was getting caught on the previously large, janky ones) and replaced the krumped sg90 servos. Also did a servo inventory and I have 1 spare after I replace the burned-out arm servos. >>33355 >head accessory It's a cheap Halloween prop (and technically a garter), today I plan to add on the ponytail papercraft. Internet cookie to whoever can guess what character I'm using for reference :D
>>33371 Yeah, the faces are going to take all of us more time than anticipated, I predict. >1 spare Sounds like it might be time to do a little shopping, Mechnomancer? >1 free Internets DERPY HOOVES' TAIL MATCHES SPUD'S HAIR! :3
>>33351 I can't recall if this was addressed before, but is there a reason the eyelids are flat? Dome eyelids have been the standard for a long time with animatronics and ventriloquist dolls, and they look really nice when calibrated right. I can't imagine it'd be any more mechanically complex, since it's just a dome on a hinge, though the actual parts might be more difficult to make. >>33372 The eyes match too lol.
>>33374 >but is there a reason the eyelids are flat? The mechanism I designed is actually quite a bit simpler (about half a dozen parts) than spherical eye mechanisms with similar articulation. I looked into those spherical eyes and they're complex little beasties. Besides, I'm not going for realistic, I'm going for WAIFU :D Added a separate servoboard in the head powered right off the pi with some basic ambient movement. I'll have to change the address of one of the torso boards eventually. Eyelids are lopsided so I might give each of them a servo (I have an extra one I was intending to use for a mouth-flap) and I need to extend the range of the eyelid. It only uses a pulse of 1250 through 2200, so I have an additional 750 of movement available. Didn't work on fleshing out the ponytail due to IRL shenanigans. Also looking into a locally hosted TTS AI right on the pi. Piper TTS works and I could probably make my own uwu voice model.
Open file (79.66 KB 736x1571 ReimuDoll.jpg)
Open file (91.15 KB 728x515 RyuZULovingClock.jpg)
Open file (2.94 MB 4961x7016 DerpyAngel.jpg)
>>33371 >Refining her face You really should replace her eyelids. Straighter eyelids would help her eyes look cuter when she's opening and closing them. https://www.youtube.com/watch?v=OlOoBnXLLD0 >Garter Gears are a cute, reminds me of RyuZU >>33374 Speaking from experience, dome eyelids are tricky. They have to be flush with the eye and socket. They have to move very smoothly, with exact precision, and they must do so completely silently. Else, they look like a creepy zombie robot. It's sadly harder than it looks. >>33372 >Derpy Now that's a call back. I like her bubbly personality. Best mail mare :^)
Open file (5.45 MB 568x320 Snarky Spud.mp4)
Integrated Piper tts into my demo voice detection program and some facial movements (there is a slight additional delay as the model speaks too fast for my liking so I have to use ffmpeg to slow down the wav file). Gotta get around to fixing the asymmetrical eyelids too :D
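For anyone wanting to copy the trick, a minimal sketch of that render-then-slow-down pipeline, assuming the piper CLI with a downloaded .onnx voice and a stock ffmpeg/aplay install (the model filename and the 0.85 tempo are placeholders, not SPUD's actual settings):
[code]
import subprocess

def speak(text, voice="en_US-lessac-medium.onnx",
          raw="spud_raw.wav", slow="spud_slow.wav", tempo=0.85):
    # 1) Render the phrase with Piper (it reads the text on stdin).
    subprocess.run(
        ["piper", "--model", voice, "--output_file", raw],
        input=text.encode(), check=True)
    # 2) Slow it down with ffmpeg's atempo filter (pitch is preserved;
    #    values below 1.0 stretch the audio out).
    subprocess.run(
        ["ffmpeg", "-y", "-i", raw, "-filter:a", f"atempo={tempo}", slow],
        check=True)
    # 3) Play it back (aplay ships with ALSA on the Pi).
    subprocess.run(["aplay", slow], check=True)

speak("Hello, I am SPUD.")
[/code]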
>>33398 Nice advance, Anon! Do you plan to dedicate add'l compys onboard to offload things like speech & vision in the future, Mechnomancer? Cheers. :^)
I adjusted the right eyelid to be more symmetrical to the left one and finished up the ponytail. I just need to secure the face better to the skull (the masking tape wore off and only one machine screw in the forehead is holding it on). Went to [s]Tashi station to pick up some power converters[/s] the mailbox today and picked up the shipment of ASMC04B servos. I can get the new shoulder joints done, and it turns out I had more spare servos for SPUD's arms than I thought. It's kinda fun having a head-shaped computer sitting there while you're working on it. A headless/armless robot body on the other hand... And looking at these speakers I got the sudden idea to install them in the... uh... chest region. Probably SPUD's machine spirit letting me know (she's getting cheeky). Praise the Omnissiah \o/ >>33410 >Do you plan to dedicate add'l compys onboard to offload things like speech & vision in the future...? I plan to integrate at least 2 modes: demo mode (or offline mode), which will run entirely on-board and use some pre-rendered voice files (rendering new voice files for utilities such as time and date/weather only if there's been a day in between asking, otherwise reusing previously rendered ones), and AI mode, which will utilize a soon-to-be-updated version of the SPUD Server file on my "server". And voice commands to switch in-between them! Also will add in a "sleep" function that cuts power to the face servos after a while (cuz sg90s are twitchy) and closes the eyes. Wake word will be something like "SPUD" and "Wake up" or something before power to the face gets reconnected and any other voice commands are recognized (already have a "sleep" command integrated to check the blink position, so I just need to make it time-activated). And of course I need to re-integrate the "Simon Says" function. That would go over well at the next exhibit :D
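A rough sketch of how that wake-word / sleep gating might hang together in plain Python, fed with whatever the recognizer hands back (the hardware calls here are print-stub placeholders, not SPUD's real servo-switch code, and the timeout is a made-up number):
[code]
import time

AWAKE_TIMEOUT = 120      # seconds of silence before she dozes off
awake = False
last_heard = time.time()

# --- stand-ins for the real hardware calls (placeholders only) ---
def set_face_power(on):  print("face servo rail", "ON" if on else "OFF")
def set_eyes(open_):     print("eyelids", "open" if open_ else "closed")
def run_command(text):   print("running command:", text)

def go_to_sleep():
    global awake
    awake = False
    set_eyes(False)
    set_face_power(False)   # kill the twitchy sg90s while idle

def handle_utterance(text):
    """Feed each phrase the recognizer returns through this gate."""
    global awake, last_heard
    text = text.lower()
    if not awake:
        if "spud" in text and "wake up" in text:   # only the wake phrase counts
            awake = True
            last_heard = time.time()
            set_face_power(True)
            set_eyes(True)
        return
    last_heard = time.time()
    go_to_sleep() if "sleep" in text else run_command(text)

def check_idle():
    """Call every main-loop pass; auto-sleep after the timeout."""
    if awake and time.time() - last_heard > AWAKE_TIMEOUT:
        go_to_sleep()
[/code]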
>>33423 >It's kinda fun having a head-shaped computer sitting there while you're working on it. Believe it or not, I've seen that there will be a big market for just this! Kind of a 'babby's first waifu'-tier thing, where the entry cost is very low. >Praise the Omnissiah \o/ AUUUUUGHHH!! You guys keep saying that. Don't worship idols made with your own hands, bro. Worship the one true and living God. :D >And of course I need to re-integrate the "Simon Says" function. That would go over well at the next exhibit :D Agreed. Well, they always tell me, 'make hay while the sun shines'. No time like the present to get that implemented, Anon. Best to take advantage of your 'forward momentum' while your last event is still fresh in your mind, no? Cheers. :^)
>>33427 >AUUUUUGHHH!! and idols I know you're (likely) being facetious but for the benefit of those who are like the pharisee in Luke 18:9-14... As I understand, idolatry is confusing the icon with the thing itself. It is important to note that due to the tower of Babel, mankind can misunderstand human words (even when speaking the same language!). But so long as the meaning behind the words is understood one is still walking with God (after all, God knows what we need to hear to find faith and will provide it, but lets us find it of our own free will). Example: The Mesopotamian God Marduk tries to pay attention to the world and speaks truth, and in this way was an earlier (but rather inaccurate) manifestation of the word of the one true God: paying attention and attempting difficult things is good, but it leaves out the inevitability of falling short/self-improvement motif. If one is worshiping exclusively Christ or a specific statue of Christ rather than the way (it may be more appropriate to say the lifestyle or the content of what) he preaches, that is idolatry and not worshiping the one true and living God. Christ says he is the way, in that he embodies it. But if you try to act like the literal son of god you won't be able to follow him perfectly because... well... chances are that you are not divine like Christ so best just to follow his teachings. I do not worship the works by my hands. They are a byproduct of my faith or my faith made manifest. Or to put it in a metaphor, the works by my hands are a way to the path to faith. While the ass's jawbone Samson used or the Shroud of Turin is important and neat they wouldn't take precedent over God. >'babby's first waifu' I'll certainly see about releasing some of SPUD's head files at some point since I tried to keep the design simple. My files are quite disorganized with all the revisions and I never bothered to sort into "outmoded" and "current". A little papercraft and a little 3d printing and "boom" AI waifu avatar! > No time like the present to get that [simon says] implemented, Anon See video: it was rather easy to implement (look for "simon says" in the string of detected words, remove the phrase "simon says" then pass it to the tts function). I also managed to get pre-rendered voice files implemented (not in video) and a voice command to shut down the entire raspi with yes/no confirmation, of course. I'll have to see about adding in a face-detect security function so certain voice commands only work if SPUD has recently detected my specific face.
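The "simon says" detection described above really is just a couple of lines; a minimal sketch, where the tts() call is a print stand-in for the real Piper routine:
[code]
def tts(text):
    print("SPUD says:", text)   # stand-in for the actual TTS call

def process_speech(detected):
    """Repeat anything prefixed with 'simon says'; otherwise ignore it."""
    lowered = detected.lower()
    if "simon says" in lowered:
        # keep everything after the trigger phrase and speak it back
        phrase = lowered.split("simon says", 1)[1].strip()
        if phrase:
            tts(phrase)

process_speech("Simon says robowaifus are the future")
[/code]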
Open file (342.46 KB 800x911 Roxie.png)
>>33423 She's looking like a Persocom Roxie >Chicks dig giant robots Nice
>>33440 >Persocom Roxie Hair is based on the Fate Series "Mordred"... I kinda dig that character's design. A smol demo of the simon says function as compared to some other pre-rendered voice files that play in reaction to keywords in the voice detection. Also have a voice command to type in a simon says sentence in the console window, but haven't gotten around to making it export to a separate file, just into the temporary file SPUD uses to speak. I could probably cut down on time considerably if I kept the wav data internal (y'know, inside the program, in RAM) and didn't have to deal with reading/writing to disk. Something to look into, I suppose. I'll also have to add in some random eye movement -changing position during blinks- and double-check the eyebrows: they should've been moving upwards at the end where she closed her eyes, since the detected speech contained no keywords.
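One low-effort way to keep the audio in RAM without touching Piper's internals: render each phrase once, stash the wav bytes in a dict, and pipe them straight into aplay's stdin on replay. A sketch under assumptions (the voice model name is a placeholder, and it still writes one temp file on the very first render of each phrase):
[code]
import os
import subprocess
import tempfile

_wav_cache = {}   # phrase -> wav bytes held in RAM after the first render

def get_wav(text, voice="en_US-lessac-medium.onnx"):
    """Render with Piper only the first time; afterwards serve from RAM."""
    if text not in _wav_cache:
        with tempfile.NamedTemporaryFile(suffix=".wav", delete=False) as f:
            tmp = f.name
        subprocess.run(["piper", "--model", voice, "--output_file", tmp],
                       input=text.encode(), check=True)
        with open(tmp, "rb") as f:
            _wav_cache[text] = f.read()
        os.remove(tmp)
    return _wav_cache[text]

def play(text):
    # aplay reads a wav from stdin when no filename is given,
    # so nothing touches the disk on replay.
    subprocess.run(["aplay"], input=get_wav(text), check=True)

play("What is thy bidding, my master?")
play("What is thy bidding, my master?")   # second call: RAM only, no Piper
[/code]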
>>33429
>I know you're (likely) being facetious
Heh, yeah my humor is a bit /b/-tard'd, so yeah... :DD
---
If you honestly want to discuss this, then I'd recommend we do so in the Christian Derail thread (>>2050), since my theological view on this topic is strictly from the Christian Bible.
>>33538
Neat! Thanks for the update, Mechnomancer. SPUD's looking really cool. I like where you're going with her motif. Do you think you'll extend that to the rest of her body?
---
https://www.etsy.com/market/mordred_fate_pepakura
>>33564
>discuss le christianity
Nah, I've said my bit :)
>>33565
>more-dred pepakura
I am a bit tempted to make SPUD's face more Mordred-like, but other than that, not really. Got a friend who offered to make some custom clothes for SPUD tho (since I wanna show off the Alita-esque arms), and since SPUD's size is more or less a "small" (as I figured out with the morphsuit earlier) I could always just buy some costumes in that size.
Speaking of showing off... I optimized SPUD's wiring and got all the I2C bits connected up without issue. Next phase is to wire up the servo power (both for the ASMC04Bs and the standard servos), but for now I game:
>>33602
>Speaking of showing off... I optimized SPUD's wiring and got all the I2C bits connected up without issue.
Excellent! I wonder if you can somehow monitor voltage/power loads at individual devices? Might be helpful to avoid what happened at the fair?
>filename
The world is healing. :^)
<insert pic: Sukabu's gaming-robogrill w/ open shells and housefans on her>
>=== -sp edit
Edited last time by Chobitsu on 09/16/2024 (Mon) 00:52:33.
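If you ever wanted actual numbers per rail, maybe an INA219 breakout wired inline on the supply could do it? A rough sketch using Adafruit's CircuitPython driver - the wiring, the default address, and that 7.0 V alarm level are all just my guesses, not your setup:

import time
import board
import busio
from adafruit_ina219 import INA219

i2c = busio.I2C(board.SCL, board.SDA)
sensor = INA219(i2c)              # default address 0x40

while True:
    volts = sensor.bus_voltage    # volts on the load side of the shunt
    milliamps = sensor.current    # current through the shunt, in mA
    print(f"{volts:.2f} V  {milliamps:.0f} mA")
    if volts < 7.0:               # made-up threshold for a sagging servo rail
        print("Servo rail sagging - maybe shed the heavy loads")
    time.sleep(1.0)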
>>33603
>the fair powerconverter explosion
I got some voltage converters *meant* for powering servos on RC planes, rated 8A constant / 16A peak, with safety shutoffs and stuff. Should be no more problems. The only other safety things I have to do are a servo-switch for the face power and an emergency power cut for the main servo power (after I install the batteries/AC-adapter switch).
I also have to figure out how to keep SPUD active while the game plays (so I can ask for time, weather, etc. while gaming). That will probably involve my ol' standby of creating a separate program and a dropbox to tell the program what to run on which voice command (rough sketch below).
Then I'm gonna work on making it all modular and documented, and prepare for the release of a SPUD head: "Your Paper and Plastic Pal"
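The dropbox idea is basically a folder both programs can see: the voice loop drops little command files, the helper process watches the folder and runs whatever matches. Sketch only - the folder path and command names are placeholders, not the real ones:

import json
import time
from pathlib import Path

DROPBOX = Path("/home/pi/spud_dropbox")   # placeholder path
DROPBOX.mkdir(exist_ok=True)

def post_command(name, **kwargs):
    """Voice-loop side: drop a command file for the helper to pick up."""
    (DROPBOX / f"{time.time():.0f}_{name}.json").write_text(json.dumps(kwargs))

def watch(handlers, poll=0.5):
    """Helper side: run whichever handler matches each dropped command."""
    while True:
        for f in sorted(DROPBOX.glob("*.json")):
            name = f.stem.split("_", 1)[1]
            args = json.loads(f.read_text())
            f.unlink()                         # consume the command file
            handlers.get(name, lambda **_: None)(**args)
        time.sleep(poll)

# watch({"say_time": lambda **_: print("announcing the time")})   # example handler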
>>33607
>Should be no more problems.
Good to hear.
>Then I'm gonna work on making it all modular and documented and prepare for the release of a SPUD head
That'll be excellent, Mechnomancer. Good luck!
>"Your Paper and Plastic Pal"
<"Who's Fun To Be With!"
Open file (1.74 MB 2899x4160 spuddy speakers.jpg)
If speakers are not boob why boob shape?
>>33647 Lol. Speakers are not for the bobs! Heh, so I guess dear SPUD will be a walking sound mixing board eventually, too? :^)
Open file (4.78 MB 320x570 Spud Lookie Loo 2.mp4)
>>33653
>SPUD will be a walking sound mixing board eventually
Nah, SPUD needs speakers so people can hear her voice! XD Putting 'em in the bewbs is also a logistical choice, because the magnets in the speakers are kept away from the rest of the wiring, minimizing the chance of motor noise. Plus it's funny.
Also whipped up a basic ambient movement program using the new servo power converters. Works great!
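For anyone curious, the ambient movement mostly boils down to drifting a few servos toward small random targets and easing between poses. A stripped-down sketch assuming an Adafruit ServoKit-style PCA9685 board - the channels and angle ranges here are made up, not SPUD's real mapping:

import random
import time
from adafruit_servokit import ServoKit

kit = ServoKit(channels=16)
JOINTS = {0: (80, 100), 1: (70, 110), 4: (85, 95)}       # channel: (min, max) degrees
current = {ch: sum(limits) / 2 for ch, limits in JOINTS.items()}

while True:
    targets = {ch: random.uniform(*limits) for ch, limits in JOINTS.items()}
    for _ in range(30):                                   # ease toward the new pose
        for ch, tgt in targets.items():
            current[ch] += (tgt - current[ch]) * 0.1
            kit.servo[ch].angle = current[ch]
        time.sleep(0.05)
    time.sleep(random.uniform(1.0, 3.0))                  # hold the pose a moment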
>>33654
Yeah, it looks great, Anon. I forget if you have a camera+OpenCV face-recognition setup yet? If so, that ambient movement with a general focus on whoever's nearby would be pretty sweet. Keep up the great work, Mechnomancer! Cheers. :^)
Open file (52.72 KB 545x1286 1wn9fk5ompy21.jpg)
>>33647 >>33654 Not gonna lie, out of all the things I imagined you could do with tits on a robowaifu, installing a speaker system was not one of them. I hope the microphone isn't where I think it is, because that spot is supposed to be for the factory reset switch.
>>33655
You'd need templates for facial recognition with just OpenCV. Nowadays AI is used for facial recognition.
>>33679
>You'd need templates for facial recognition with just OpenCV. Nowadays AI is used for facial recognition.
In my experience, it just werks right out of the box. And the OpenCV library -- being pure C++ -- will run on very smol hardware (such as a Pi Zero). There's even a camera (the JeVois) that literally runs OpenCV right on the camera.
>>33679
You can use a Haar cascade for basic face detection (via OpenCV): https://pyimagesearch.com/2021/04/05/opencv-face-detection-with-haar-cascades/
Or the face-recognition library, which has been kicking around since 2017. You can generate templates (models?) for individual faces based on multiple pictures, but a single picture can work as well: https://pypi.org/project/face-recognition/
I might try to use both to see if it saves on processing time: a Haar cascade to find the area of SPUD's camera image that has a face, then only feed that part of the image to the more sophisticated library.
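Something like this is what I have in mind for combining the two - the cascade parameters are just the usual defaults, and "my_face.jpg" is obviously a placeholder reference photo:

import cv2
import face_recognition

cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
known = face_recognition.face_encodings(
    face_recognition.load_image_file("my_face.jpg"))[0]   # placeholder reference photo

def is_me(frame_bgr):
    """Cheap Haar pass to find a face box, heavier encoding only on that crop."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    for (x, y, w, h) in cascade.detectMultiScale(gray, 1.1, 5):
        crop = cv2.cvtColor(frame_bgr[y:y + h, x:x + w], cv2.COLOR_BGR2RGB)
        encodings = face_recognition.face_encodings(crop)
        if encodings and face_recognition.compare_faces([known], encodings[0])[0]:
            return True
    return False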
>>33683
>a haar cascade for basic face recognition
Yep, that's it. And a lot of researchers and industry folks have banged on that C++ code for over a decade now; it's probably close to optimal at this point. It certainly would run on just an MCU, if some Anon wanted to specialize a compilation just for that. Probably already has been done, tbh.
>>33647 >>33654 >>33656 LMFAO, I don't think you guys understand the cuddling and ASMR potential. Joke or not, Mechnomancer may have just uncovered an interesting use case (pic related) :^))))
>>33718
Well, I'm just working on the robot-y bits. When I eventually (and hopefully) release SPUD, it is none of my business what sort of /clang bits you decide to mod in there :D
Since local document integration into an LLM seems kinda weird and hinky, I figured I'd just brew up my own using some basic loops. Eventually (like with the screenface) I'll have the code scan through the entire directory, so you can scan multiple files just by drag/drop. There are a few diagnostic print bits in the function you'll probably want to comment out.
Eventually I'll set up voice commands to trigger memory by asking for words: "do you remember xyz" would do a document scan of the chatlogs for xyz. Could even trigger different files with different commands, so "do you remember blah blah about robotics" could trigger searching "robotics.txt" for blah blah, or "do you remember blah blah about anime" triggers searching "anime.txt" for blah blah. Send that to the AI and ask it to summarize/respond to it in character. This way there can be both long-term and short-term memory and -like people- it won't be perfect :3
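The memory lookup itself is just a keyword scan plus a prompt; roughly like this (the topic-to-file map and ask_llm() are stand-ins for whatever the real setup ends up being):

TOPIC_FILES = {"robotics": "robotics.txt", "anime": "anime.txt"}   # placeholder map
DEFAULT_LOG = "chatlog.txt"

def recall(query, topic=None):
    """Scan the matching log file for lines containing the query."""
    path = TOPIC_FILES.get(topic, DEFAULT_LOG)
    hits = []
    with open(path, "r", encoding="utf-8") as f:
        for line in f:
            if query.lower() in line.lower():
                hits.append(line.strip())
    return hits[-5:]                     # keep only the few most recent matches

def answer_memory_question(query, topic, ask_llm):
    memories = recall(query, topic)
    if not memories:
        return "I don't remember anything about that."
    prompt = "Summarize these notes and respond in character:\n" + "\n".join(memories)
    return ask_llm(prompt)               # ask_llm() stands in for the real chat call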
>>33735
I really like your coding style, Anon. Good naming, good comments, sensible logic flow. Keep up the great work, Mechnomancer! Cheers. :^)
>and -like people- it won't be perfect :3
Lol, true. We've had this discussion here more than once. In the case of Chii, for example, her naïveté early on in her life with Hideki is a big part of her robowaifu'y charm. :D
Open file (373.73 KB 3120x4160 spud in her shirt.jpg)
Open file (493.91 KB 4160x3120 neck details.jpg)
Open file (273.48 KB 3120x4160 spud shoulder.jpg)
>>33739
The style is as much for my sanity as it is for others :D
Fitted SPUD's hoodie vest to her figure a bit more; makes her look more squish and approachable :3 I'll have to see about modifying that ol' morphsuit into a vest sort of thing so her Alita-esque arm panels don't get covered, and put in some padding to make the squish.
Also printed some new clavicles (more curved) and turned the old ones into neck-tendon-lookin' things. I like neck definition, I guess ¯\_(ツ)_/¯
Also extended the keyway on the chonky servos and printed more compact shoulder servo horn hubs so the shoulders are a more feminine width.
>>33782 Clothing really does make a major improvement in how huggable she looks.
>>33782
>Also printed some new clavicles (more curved) and turned the old one's into neck-tendon-lookin things.
They look great, Mechnomancer. SPUD's really coming along nicely!
>I like neck definition I guess ¯\_(ツ)_/¯
Me too. I personally consider the neck to be actually part of the face in my modelling (well, the head, certainly). Good work, Anon. Keep it up! Cheers. :^)
>>33423
>Also will add in a "sleep" function that cuts power to the face servos after a while (cuz sg90s are twitchy) and close the eyes. Wake word will be something like "SPUD" and "Wake up" or something before power to the face gets reconnected and any other voice commands are recognized (already have "sleep" command integrated to check the blink position so I just need to make it time-activated).
Good idea. It's worth thinking about variants of that: noise-sensor-based activation, maybe based on noise level or, if possible, something in the range of speech; a small voice-detection sensor recognizing the name (e.g. SPUD); someone touching or moving the body; ...
Open file (1.89 MB 1594x1054 volume knob.png)
Open file (37.90 KB 625x995 SPUD setup.jpg)
>>34013
>Noise sensor based activation
A noise sensor... you mean a microphone? lol
Back with the physical jaw I did modify a cheap visual equalizer to give input to a GPIO pin. I could do something similar... or just use wake words. Gyro activation or other touch sensors (strategically placed buttons) could help too, like a headpat/nose-boop button or detecting when SPUD gets pushed (quick sketch at the end of this post).
Been working on other things, but managed to get around to installing a back panel on SPUD's skull and mounting the volume control knob there.
I'm also not particularly confident in a bipedal walking robot right now: I might roll back SPUD's legs to being just posable and have her sit atop a walking trash can -I mean- robot companion. It can hold goodies like a Kinect (for navigation), batteries and stuffs. SPUD and BUD. Or I could make some reciprocal-motion walking legs like James Bruton did for Halloween: https://www.youtube.com/watch?v=AEXz8xyTC54
Might try to build a 5-foot RS Media to get the hang of large bipedal motion, but that ain't a robowaifu so not applicable here lol.
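The headpat/boop buttons would be pretty trivial with gpiozero; rough sketch only - the pin numbers and the wake() hook are placeholders, not the real wiring:

from gpiozero import Button
from signal import pause

headpat = Button(17)      # placeholder BCM pin under the hair/head panel
nose_boop = Button(27)    # placeholder BCM pin behind the nose

def wake(source):
    # Hypothetical hook: re-enable face-servo power, play a reaction clip, etc.
    print(f"Waking up from {source}")

headpat.when_pressed = lambda: wake("headpat")
nose_boop.when_pressed = lambda: wake("nose boop")

pause()                   # keep the script alive, waiting for button callbacks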
>>34194
>> Noise sensor based activation
>A noise sensor... you mean a microphone? lol
No, I was thinking of something that only measures the noise level and is connected to a low-power device.
>>34194
>SPUD and BUD.
I like it, Mechnomancer! I'm actually working on prototyping a little pony for dear Sumomo-chan to ride, and it will mount on a little flitter-car platform so they can both scoot around quickly. When they get where they're going, Sumomo can dismount and have picnics with her Master. :D
Open file (4.79 MB 480x864 Spud Arm Swing.mp4)
>>34195
A USB microphone probably consumes less power than anything else I could come up with, because even when the robowaifu is "sleeping" the raspi will always be on.
While trying to calibrate the abdominal servos I broke the servo horn (need to re-print it with 50% infill and a bolt running thru it for support). But in the meantime I mounted SPUD's torso on a little stand with a keyboard shelf so I could calibrate the arm servos. All calibrated/mapped and running a basic script to move the joints back and forth (calibration sketch below).
Time to start compiling everything into a single set of scripts, and start work on BUD.
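The calibrate/map step is mostly just recording each joint's safe pulse-width range once and clamping angles to it at runtime; a quick sketch with ServoKit, where all the numbers are placeholders rather than the measured values:

from adafruit_servokit import ServoKit

CALIBRATION = {
    # channel: (min_pulse_us, max_pulse_us, min_angle, max_angle) - all placeholders
    0: (600, 2400, 10, 170),    # e.g. shoulder
    1: (500, 2500, 20, 160),    # e.g. elbow
}

kit = ServoKit(channels=16)
for ch, (lo_us, hi_us, _lo_deg, _hi_deg) in CALIBRATION.items():
    kit.servo[ch].set_pulse_width_range(lo_us, hi_us)

def move(ch, angle):
    """Clamp requested angles to the joint's measured safe range."""
    lo_deg, hi_deg = CALIBRATION[ch][2], CALIBRATION[ch][3]
    kit.servo[ch].angle = max(lo_deg, min(hi_deg, angle))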
>>34226
>But in the meantime I mounted SPUD's torso on a little stand with a keyboard shelf so I could calibrate the arm servos.
Very handy. Good thinking, Mechnomancer. Cheers. :^)
>>34226 Hmmm James might be onto something here. I'll be back in a while lol. https://www.youtube.com/watch?v=AEXz8xyTC54
>>34226
Those speaker boobs are comically disturbing due to how they recess. Do they have some sort of cover that goes over them? It looks like you made an attachment ring for a mesh or foam cover, I presume. That would help prevent speaker damage.
>>34254
This is a good example of why I keep pushing the use of LARPfoam for our robowaifus' 'undershells'. LARPagans have a big set of communities around this stuff today, and it's a good idea for us here to benefit from all that information. BTW, another Anon also posted this video here, but I can't locate that post ATM.
>I'll be back in a while lol.
Lol, don't stay away too long, Mechnomancer. ABTW, this is a daily reminder you'll need a new bread when you get back. Alway rember to link the previous throd in your OP. Cheers, Anon. :^)
>>34259
Great find Kiwi, thanks very kindly. I wonder what Bruton has in store for his 'next design that walks better'? Cheers. :^)
New bread: >>34445
