/robowaifu/ - DIY Robot Wives

Advancing robotics to a point where anime catgrill meidos in tiny miniskirts are a reality!

We are back again (again).

Our TOR hidden service has been restored.



“Decide carefully, exactly what you want in life, then work like mad to make sure you get it!” -t. Hector Crawford


Open file (2.28 MB 320x570 05_AI response.mp4)
Open file (4.77 MB 320x570 06_Gyro Test.mp4)
Open file (8.29 MB 320x570 07B_Spud functions.mp4)
Open file (1.06 MB 582x1446 Bodysuit.png)
SPUD Thread 2: Robowaifu Boogaloo Mechnomancer 11/19/2024 (Tue) 02:27:15 No.34445
This first post is to show the 5 big milestones in the development of SPUD, the Specially Programmed UwU Droid. You can see the old thread here: >>26306

The end goal of SPUD is to provide a fairly high-functioning robot platform at a relatively low cost (free code but a few bucks for 3d print files) that can be used for a variety of purposes such as promotional, educational or companionship. All AI used is hosted on local systems: no bowing to corporations any more than necessary, thank you. Various aspects of the code are/will be modular, meaning that adding a new voice command/expression/animation will be as easy as making the file, naming it and placing it in the correct folder (no need to mess around with the base code unless you REALLY want to).

While I'm researching more about bipedal walking I'll be making a companion for SPUD to ride on, so it might be a while before I return to the thread.
>>38551 >Shielded twisted pairs are great for resisting EM interference. This. I'm hoping we can get away w/o the cladding on short runs, but we'll have to test this out in the realworld, in situ.
Open file (5.82 MB 320x240 mechduino_4boards.gif)
>>38553 >"Why choose just one, when we can have 9 more for ten-times the price!?" I meant the cables only have wires connecting 7 of the 15 pins. Half of the pins on each end have no continuity between them. I didn't realize I already had an arduino uno in Carry; gonna do her tests after posting here, because I spent my morning getting a multi-board blinka test running (picrel) for mechduino. The code could probably be more streamlined but at least I find it pretty clear to use. Since communication with the boards is sent as 2 lists of pin states (list of pins to set to 0, list of pins to set to 1) you have to clear the lists every time you begin a loop. I could probably make something happen in the pin-set functions to automagically remove pins from one list if they appear in the other but honestly it isn't worth the hassle.
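The two-list message scheme described above (plus the "automagic" dedup the poster decided wasn't worth the hassle) could be sketched roughly like this. All names here are illustrative guesses, not the actual Mechduino code:

```python
# Hypothetical sketch of the Mechduino-style message: each loop sends
# (pins_to_set_low, pins_to_set_high) and both lists are cleared after.
class PinStateMessage:
    def __init__(self):
        self.pins_low = []   # pins to set to 0
        self.pins_high = []  # pins to set to 1

    def set_low(self, pin):
        # dedup: a pin can't appear in both lists at once
        if pin in self.pins_high:
            self.pins_high.remove(pin)
        if pin not in self.pins_low:
            self.pins_low.append(pin)

    def set_high(self, pin):
        if pin in self.pins_low:
            self.pins_low.remove(pin)
        if pin not in self.pins_high:
            self.pins_high.append(pin)

    def flush(self):
        # return this loop iteration's message, then clear both lists
        msg = (list(self.pins_low), list(self.pins_high))
        self.pins_low.clear()
        self.pins_high.clear()
        return msg
```

With the dedup inside the set functions, the caller never has to worry about contradictory pin states, and `flush()` handles the per-loop clearing automatically.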
>>38570 Ahh, got it. I meant as much with my little (realworld) joke: telcos have been running trunks with dark-fibres for decades now (often along train track ROWs & similar). The costs of the long-a*rse fibre runs are still basically 1:1 (or 10:10, heh :D) -- the amortization is present b/c the pull is an expensive part of the whole proposition (which would basically be the same whether just 1 fibre, or 1'000 fibres). Make sense, Anon? :^) >gonna do her tests after posting here Neat! >but honestly it isn't worth the hassle. Ehh, just make it work for your needs as-is, would be my suggestion. For RW Foundations, however, the goal is generality. Therefore, so-called 'immediate mode' is pretty much a given. Nice to see your progress, Mechnomancer, as usual. Cheers. :^)
Well nuts, pyfirmata is fubar so I gotta re-do Carry's code with Mechduino. Ah well, at least I know that Carry can idle for about 420 minutes on a full charge.
Smol quick update: Got Carry working via mechduino after finding out 2 of the USB ports in the raspi case were dead. She has an untethered runtime of approximately 150 minutes when consistently driving. Since I plan to use the same type of power supply for Sploot, I can expect that mechdog to have a roughly 40-minute runtime untethered, and as a Sploot/SPUD centaur perhaps a 30-minute runtime.
>>38572 >>38574 Nice to know realworld performance stats. Do you currently have any estimates as to the lifetime characteristics of the batteries themselves? Cheers, Mechnomancer. :^)
>>38575 >U got battery life estimates? Not really, but given the weight it is a lead-acid battery and in my experience those are troopers. And according to the manual it can be easily replaced. I added the board id option in Mechduino so if a person needs to use mechduino as multiple boards they can easily be distinguished (board ids are set when you upload the arduino code). Takes a few extra lines of code to automagically detect but I'll be sure to put that in the example. Mechduino is pretty much ready for a public release, just need to find the time to set it all up on github. Once I get my github lair set up I'll work on Pringle's durability testing and (eventual) release P: >=== -rm wordfilter spew -patch post
Edited last time by Chobitsu on 05/22/2025 (Thu) 22:05:58.
>>38666 LOL. Sorry, you've run-afoul of one of the /cow/ wordfilters here, Mechnomancer. :D I'll edit out the BS, and then please fill me in on the right words, and I'll patch that up afterwards, too.
>>38670 honestly idk what I put there originally lol, I guess just replace the "[...]" with "person needs to use mechduino"
>>38679 >honestly idk what I put there originally lol Lel'd. :D Done.
Open file (117.37 KB 1200x1200 hexbug black widow.jpg)
If Sploot can't handle the weight of SPUD, I'm totally gonna build a platform like this for SPUD to sit on and base the walking mechanism on the hexbug black widow (on second thought I might just build a massive spider anyway cuz my side-projects are just like that lol) : https://www.youtube.com/watch?v=JNSiiFQsh54
>>38692 Spider girl robowaifus!
>>38692 Spooderbros are cool yet spoopy. Any idea what you're gonna name it yet?
>>38694 Idea: Arachne
>>38774 Lol. >you now remember Arachnophobia. :D
While doing some R&D for SPUD's new body paneling I stumbled across a technique to create a humanoid form rather cheaply without the use of a 3d printer... probably about $30 and 2-3 hours assembly time (depending on your skill). I'll post more details once I get the SPUD version complete. It would be a good thing to release with Pringle: not requiring a 3d printer would make Pringle more accessible to a wider scope of Anons, making Pringle a true model T robowaifu :D
>>38947 That sounds awesome, Mechnomancer! Maybe you can do an in-depth detail on this in our Materials & Production bread : ( >>37774 )? Cheers. :^)
>>38947 Is it foam sheets?
Open file (152.16 KB 782x2190 SPUD persona test.png)
While taking a break from perfecting my super duper ultra secret (at least until I post it lol) over 9000 awesome robowaifu body budget -no >>38951 greer not foam sheets unless you really really want to use em- I followed this guide: https://buyzero.de/blogs/news/deepseek-on-raspberry-pi-5-16gb-a-step-by-step-guide-to-local-llm-inference and got a somewhat decent LLM running on a raspberry pi 5 16gb. Depending on the model your results may vary: you will have to explore the ollama models and experiment to see which one is right for you. The attached is a sample response with some statistics and the "thinking" (aka the LLM having an internal monologue) with the deepseek model (default model in the article). I found a model kinda good at doing SPUD's persona without the Deepseek model's gigabrain energy but I forgot which one and am too ̶l̶a̶z̶y̶ busy to check right now. I'll try to swap out SPUD's main computer with the Pi 5 when I replace her eyelid rack (no blinking at the moment oh no!)
>>39206 Good timing lol Now I'm curious to see this new construction method.
>>39206 POTD Nice work, Anon. Really looking forward to your latest advances with dear SPUD. Cheers. :^)
>>39229 Thanks for the update, fren Mechnomancer. Godspeed.
@Mechomancer, you should make an Odysee page and/or Neocities website for your final releases. If it can fit, also use catbox for archival.
Open file (20.10 KB 480x360 hqdefault-787845404.jpg)
>>39234 oh dear, looks like my less detailed post about the raspi LLM got restored, which puts a little redundancy in the thread. Ah well. I asked chatgpt how to use the python Ollama library to stream the LLM responses sentence by sentence with the goal of reducing the appearance of latency. To explain further, there will be a pause when the initial request is sent to the LLM and the TTS generates, then while the first sentence is being spoken -aka audio is playing- the raspi will be working on generating the text & audio for the second sentence (if that makes sense). This will (hopefully) eliminate freakishly long pauses if the LLM decides to give the user a paragraph, and add the potential to cut off a response between sentences. Still have to actually implement the code but it seems to at least be possible. >>39310 I'll probably have 2 versions for Pringle: basic modular assistant and LLM, then just plop the code on github, because the SPUD line is for hospitality/greeting and not ACKSHUALLY a robowaifu... but I can't control what you decide to do with the code n stuff. ;) Tbh the entire release package probably wouldn't be all that big, maybe if there are some hires .stls, but it would probably be all under 100mb.
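The pause-hiding idea described here (synthesize the next sentence's audio while the current one is still playing) is a classic producer/consumer pattern. A minimal sketch, with `synthesize` and `play` as stand-ins for the real Piper TTS and audio playback calls:

```python
import queue
import threading

def pipeline(sentences, synthesize, play):
    """Hypothetical latency-hiding sketch: a worker thread synthesizes
    TTS audio for upcoming sentences while the main thread is still
    playing the current clip. synthesize/play are placeholder callables,
    not real Piper APIs."""
    q = queue.Queue(maxsize=4)  # small buffer: worker runs a few clips ahead

    def producer():
        for s in sentences:
            q.put(synthesize(s))  # blocks only if the buffer is full
        q.put(None)               # sentinel: no more audio coming

    threading.Thread(target=producer, daemon=True).start()

    played = []
    while (clip := q.get()) is not None:
        play(clip)                # while this blocks, producer keeps working
        played.append(clip)
    return played
```

The only user-visible pause is the synthesis of the very first sentence; everything after that is prepared in the background while audio plays.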
>>39315 >oh dear looks like my less detailed post about the raspi LLM got restored, kinda puts a little redundancy in the thread. All well. I'll delete that if you want, Mechnomancer? >but it seems to at least be possible. That will be really good news if it's feasible to queue-up a 'train pattern' for responses like that. >Tbh the entire release package probably wouldn't be all that big, maybe if there are some hires .stls but it would probably be all under 100mb. Surprisingly smol to my thinking. GG. Cheers. :^)
>>39318 >I'll delete that if you want, Mechnomancer? Sure, yeetus the f̶e̶e̶t̶u̶s̶ postus >Surprisingly smol to my thinking. I mean those are the files I'd have to host. Still would have to install python dependencies (20mb max I think) and the Ollama model (around 4 gigs) but you get those with pip and ollama, and I'd provide instructions like the article I linked :D
>>39320 Neat! >Ollama Can your arrangement work with llamacpp instead? It's lighter than ollama by all accounts. Plus its dev community is pretty fiercely working on it atm. I expect it to improve further.
Edited last time by Chobitsu on 06/15/2025 (Sun) 04:22:16.
>>39335 >llamacpp Idk, the install instructions look less straightforward. I'd have to try it out. This article looks handy tho: https://learn.arm.com/learning-paths/embedded-and-microcontrollers/llama-python-cpu/llama-python-chatbot/ Especially since it has instructions right on it for using llamacpp with python. I think a good idea is that before SPUD properly shuts down, I feed the entire conversation (up to 16k tokens, of course) to the Deepseek thinky model to summarize the conversation and modify the base prompt accordingly; that way the character AI will change or "grow" as one talks with it. This would be similar to the function of human sleep. However I'm a while away from doing that, as not only am I just barely getting my feet wet maxxing the Raspberry Pi (found out how to overclock, hooray!), but I've been busy with personal matters, as well as touching (and mowing) grass.
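The "sleep" summarization step described above is mostly prompt plumbing. A minimal sketch of the prompt-building half, before any model call: everything here (function name, the ~4-chars-per-token heuristic, the prompt wording) is an assumption for illustration, not the actual SPUD code:

```python
def build_sleep_summary_prompt(history, base_prompt, max_tokens=16000):
    """Hypothetical shutdown-time 'sleep' step: trim the conversation to
    roughly the model's 16k-token context (approximated crudely as ~4
    characters per token), then ask the thinking model to fold it into
    an updated persona prompt."""
    budget_chars = max_tokens * 4
    text = "\n".join(f"{who}: {msg}" for who, msg in history)
    if len(text) > budget_chars:
        text = text[-budget_chars:]  # keep the most recent turns
    return (
        "Here is a character prompt and a conversation transcript.\n"
        f"CHARACTER PROMPT:\n{base_prompt}\n\n"
        f"TRANSCRIPT:\n{text}\n\n"
        "Summarize what the character learned, then rewrite the "
        "character prompt so the persona grows from this conversation."
    )
```

The returned string would then be sent to the local thinking model (e.g. via the Ollama chat API), and the model's rewritten character prompt saved to disk as the new base prompt for the next boot.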
>>39387 Nice! I thought of something similar, and your AI summarization will solve the problem I realized of the prompt getting too large over time.
>>39387 >before SPUD properly shuts down, I feed the entire conversation (up to 16k tokens, of course) to the Deepseek thinky model to summarize the conversation and modify the base prompt accordingly That seems like a really good idea if feasible, Mechnomancer. Please keep us up to date on your progress. Thanks for the link resource BTW! Cheers. :^)
Robowaifu v1.0 "Carry the Workshop Waifu" retrofit is 95% complete. Wiring completely re-done to utilize my Mechduino library. Had to replace a few of the motor drivers, too. Now I just need to tweak the back of her skirt (not shown) and fix some issues with her main drive slipping occasionally. She doesn't have voice commands but that is something easily implemented in the future. Maybe I'll try to deploy the Pringle script on her? idk. She has a lot of empty space inside and -as I mentioned before- a battery pack (UPS) that can give about 7 hours idling time or 2 hours constant driving time. So I can probably have her test things like autonomous navigation and stuff. Gonna move on to some other non-waifu projects (unless you consider the 600lb mek a waifu lmao) before getting SPUD back operational.
>>39656 Nice!
It seems that after overclocking the pi, the Ollama deepseek LLM makes the pi freeze up when in thinking mode. But using a wizard uncensored model (I forget exactly which) conversationally (without thinking) has no such issues. I'll have to experiment to see if the wizard model can think. I'm also halfway done writing a terminal-based servo animation program (read/write servo positions to text file and have motion curves between the frames). Deepseek also seems to have latched on to John Wayne's football career and hallucinated a 2018 interview. Still, this is quite a lot of "thought" for such a little computer :)
>>39856 Is there any real benefit to thinking mode, especially on Deepseek? It seems like all it does is go "ummmm, let me see, yeah that could be it, yeah"
>>39860 Well it flexes on the midwits who think they're so smart but don't have an internal monologue lol I got ollama working through python with chat history working and also got Pipertts running. Just have to merge the two then I'll have some video footage ready (then work on getting the audio playing while the llm generates more text). I'll also probably implement Pipertts into Pringle and while I do that I suppose I'll get around to that long-procrastinated durability testing.
Open file (7.68 MB 576x264 SPUDs Brain.mp4)
Implementing properly ordered TTS was simpler than I thought. Currently the program generates the response until a period or comma is detected in a token, then passes that sentence (or sentence fragment) to the TTS engine (Piper). As you can see it doesn't work well for that initial run-on sentence. I would calculate by # of spaces (hence words) but sometimes the llm generates fragments of words in a token so words get split up, so I'd have to do a bit more string manipulation to do TTS by wordcount rather than punctuation. Works well enough for now and I'll fancy it up later.
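The punctuation-based chunking described here (split on period or comma because tokens may contain word fragments) could look something like this. A rough sketch, not the actual SPUD code:

```python
def stream_sentences(tokens, delimiters=".,"):
    """Accumulate streamed LLM tokens and yield a chunk whenever a
    period or comma shows up inside a token. Tokens may be fragments
    of words, so splitting on whitespace would be unreliable; the
    punctuation characters survive tokenization intact."""
    buf = ""
    for tok in tokens:
        buf += tok
        while any(d in buf for d in delimiters):
            # cut at the earliest delimiter present in the buffer
            cut = min(buf.index(d) for d in delimiters if d in buf)
            chunk = buf[:cut + 1].strip()
            buf = buf[cut + 1:]
            if chunk:
                yield chunk  # hand this fragment off to the TTS engine
    if buf.strip():
        yield buf.strip()    # flush whatever trails the last delimiter
```

Each yielded chunk would be passed straight to Piper, so synthesis can begin before the LLM finishes the full response.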
>>39964 Great progress, Anon! I'm looking forward to seeing your further progress with this sub-project, Mechnomancer. Her voice sounds pretty good already, so yeah. Keep moving forward! Cheers, Anon. :^)
Open file (57.05 KB 679x678 2b costume.jpg)
>>39965 Not a sub-project, this is gonna be SPUD's new brain :) I just need to a) integrate my mouth flap library and b) find a local object recognition library and we can start making youtube videos playing games together. "World's first real android vtuber" or some such. Also might get SPUD a 2B costume, cuz that will go over well.
>>39964 >>39966 Nice! I can't wait until it's completed. I think the 2B costume will look really nice on her.
>>39966 Oh, got it! >SPUD<=>2B That'll be ebin! DOOEET!! :DD
Open file (6.87 MB 560x320 Basic lip synch.mp4)
I knew I'd find a use for Mr Fishlips: a bench test of the lip synching. Unfortunately it seems to have skipped a sentence, but ah well, nobody is perfect. During the pauses Spud's face will assume a thoughtful expression so the user knows the text is being generated. If you listen closely you can probably hear my cat meeping for pets (don't worry, he got them because he's a good boy). I still need an adapter for the picam (and to find an object recognition library, because Mediapipe doesn't work on 64-bit raspberry pi), and to integrate voice-to-text, and the brain will be ready for installation... she will have a permanent buddy in the form of the Mars Pro Bluetooth speaker.
>>39975 POTD Neat! GG, Mechnomancer. Looks like the mouth is quite responsive, and well-synced with the voice's audio output. I'll be intredasted to see your efforts at tying this all together for dear SPUD. Cheers, Anon. :^)
>>39976 hehe I'm doing a lot of bench tests before putting this new brain in SPUD. The pi5 was expensive and I don't want to accidentally burn out another pi. Pro tip: you cannot connect a Pi and a power source for your servo boards to the same battery via DC power converters (connecting the grounds across the converters). There will be an electrical short. Carry proves that it is possible with AC power converters: AC-to-DC converters use a transformer, so there is no actual physical connection between the two power sources. I got a "thinking" indicator now on the benchtest that consists of just a servo: at a position of 500 when not thinking, and at 2500 when thinking. Not fancy enough to warrant a video but it will be a placeholder for thinking expressions.
>>39978 Ouch! We need some kind of "electronics class" here on /robowaifu/ . At least moar links in our RWUNI bread or something. I'm considering picking the Embedded Programming thred back up from Anon, since he's been long-absent now (cf. >>367 ). Great idea about the 'thinking expression'. This might help ease the experience of having response delays, for non-tech'y normals. --- Regardless, I'm glad that electrical lesson is behind you and you're pressing on Anon. Forward.
Edited last time by Chobitsu on 07/26/2025 (Sat) 03:15:34.
>>39981 When providing an external powersource for servos (via a breakout board or inside the board itself) the grounds of the pi and the external powersource are supposed to be connected, that way there is a complete circuit. I was doing what the experts say you're supposed to do, and it cost me a pi (I managed to get the latest copies of the programs off there before it completely burned out tho). tRuSt tHe eXpErTs bRo! SPUD's 2b outfit arrived today along with some bits that should make installing the brain easier. I suppose I'll have to figure out how to give SPUD a 2b level bum and boobles lol.
>>39986 >tRuSt tHe eXpErTs bRo! Heh. It'll be intredasting to see your system as a whole. >SPUD's 2b outfit arrived today along with some bits that should make installing the brain easier. I suppose I'll have to figure out how to give SPUD a 2b level bum and boobles lol. I recently saw some "biodegradable peanuts" (foam pellets) and was thinking of your robot squish quotient. Very lightweight & poofy, but probably not too lumpy.
Open file (6.13 MB 553x428 encoder_gif.gif)
I've recently been experimenting with creating motor encoders. If I can control 2000lb winches with precision, it would certainly open up more options for smaller robowaifu motors (like those linear actuators I was hoping to put in SPUD's knees last year). It's simply gutting a servo -except for the potentiometer- and connecting it to an Analog to Digital Converter (currently an arduino mega but there are some smaller I2C boards available). Use code to juggle variables and you've got yourself a closed-loop motor control system. Unfortunately sg-90s don't work really well because their plastic gears tend to get stripped, so I've been gutting larger servos that have metal gearboxes. Maybe if I tried metal-geared sg90s. I never did actually check if the sg90's pot value changed though, so it might just be me doing a dumb.
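The closed-loop idea described here (potentiometer position in, motor drive out) boils down to a feedback step like the following. This is a toy proportional-control sketch with made-up constants, not the poster's winch code:

```python
def closed_loop_step(target, pot_reading, deadband=5, kp=0.5):
    """One control step: compare the gutted servo's potentiometer
    reading (as sampled by the ADC) against the target position and
    return a motor drive value in [-1, 1]. deadband and kp are
    illustrative guesses; real values depend on the motor and ADC."""
    error = target - pot_reading
    if abs(error) <= deadband:
        return 0.0  # close enough: stop the motor to avoid hunting
    # proportional drive, clamped so large errors just run full speed
    return max(-1.0, min(1.0, kp * error / 100.0))
```

In use, this runs in a loop: read the pot through the ADC, call `closed_loop_step`, and feed the returned value to the motor driver (sign = direction, magnitude = speed). The deadband keeps the motor from chattering around the setpoint.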
>>40010 Lol, that just looks cool for some reason Anon. :D I'm sure you can figure this out, Mechnomancer. If this goal is in your budget of time & other resources then do it! Cheers. :^)
>>40013 >Lol, that just looks cool for some reason Anon. Of course it does, that is the prototype for the world's first open-source mech joint ;) And since that has been around since (way) before Carry, version 2.0 will be to the current one what SPUD is to Carry. Most of my non-waifu projects are related to each other since they all use raspberry pi & python n stuff. It's really just a difference in scale (if I can get my powerarmor project walking I can scale down the mechanism for bipedal walking, e.g. a walking robowaifu). I'm thinking maybe just sewing together some padding and putting foam sheets over it to make SPUD's lovely lumps not so lumpy. I kinda half-assed it last time ^-^;
Wig arrived, looks good for 5 seconds of styling. Once I get around to sewing together some assets I'll show how she looks in the dress.
