/robowaifu/ - DIY Robot Wives

Advancing robotics to a point where anime catgrill meidos in tiny miniskirts are a reality.

Canary has been updated.

Build Back Better

More updates on the way. -r


Open file (13.41 KB 750x600 Lurking.png)
Lurk Less: Tasks to Tackle Robowaifu Technician 02/13/2023 (Mon) 05:40:18 No.20037 [Reply]
Here we share ideas on how to help the development of robowaifus. You can look for tasks to improve the board, or for ones which would help move development forward. You could also come up with a task that needs to be worked on and ask for help: use the pattern at the top of this OP, replace the parts in <brackets> with your own text, and post it.

>Pattern to copy and adjust for adding a task to the thread:
Task: <Description, general or very specific, and the target thread for the results>
Tips: <Link additional information and add tips on how to achieve it.>
Constraints and preferences: <Things to avoid>
Results: Post your results in the prototypes thread if you designed something >>18800, or into an on-topic thread from the catalog if you found something or created a summary or diagram.

General Disclaimer: Don't discuss your work on tasks in this thread. Make a posting in another thread, or several of them, and then another one here linking to it. We have a thread for prototypes >>18800, the current meta >>18173, and many others in the catalog https://alogs.space/robowaifu/catalog.html - the thread where you post the result might also be the best place to discuss things.

>General suggestions where you might be able to help:
- Go through threads in the catalog https://alogs.space/robowaifu/catalog.html and make summaries and diagrams, as pointed out starting here >>10428
- Work on parts instead of trying to develop and build a whole robowaifu
- Work on processes you find in some thread in the catalog https://alogs.space/robowaifu/catalog.html
- Test existing mechanisms shared on this board, prototypes >>18800
- Try to work on sensors in some kind of rubber skin, and in parts >>95 >>242 >>419
- Keep track of other sites and similar projects, for example on YouTube, Twitter, or Hackaday.
- Copy useful pieces of information from threads on other sites and boards talking about "sexbots", "chatbots", AI, or similar topics. Pick the right thread here: https://alogs.space/robowaifu/catalog.html


Edited last time by Chobitsu on 05/08/2023 (Mon) 11:17:16.
19 posts and 4 images omitted.
<<placeholder for task description. To be expanded later>>

Welcome to /robowaifu/ Anonymous 09/09/2019 (Mon) 00:33:54 No.3 [Reply]
Why Robowaifu? Most of the world's modern women have failed their men and their societies; feminism is rampant, and men around the world have been looking for a solution. History shows there are cultural and political solutions to this problem, but we believe that technology is the best way forward at present – specifically the technology of robotics. We are technologists, dreamers, hobbyists, geeks, and robots looking forward to a day when any man can build the ideal companion he desires in his own home. However, we are not content to wait for the future; we are bringing that day forward. We are creating an active hobbyist scene of builders, programmers, artists, designers, and writers using the technology of today, not tomorrow. Join us!

NOTES & FRIENDS
>Notes:
-This is generally a SFW board, given our primarily engineering focus. On-topic NSFW content is OK, but please spoiler it.
-Our bunker is located at: https://trashchan.xyz/robowaifu/catalog.html Please make note of it.
-Library thread (good for locating terms/topics) (>>7143)
>Friends:
-/clang/ - currently at https://8kun.top/clang/ - toaster-love NSFW. Metal clanging noises in the night.
-/monster/ - currently at https://smuglo.li/monster/ - bizarre NSFW. Respect the robot.
-/tech/ - currently at >>>/tech/ - installing Gentoo Anon? They'll fix you up.
-/britfeel/ - currently at https://trashchan.xyz/britfeel/ - some good lads. Go share a pint!
-/server/ - currently at https://trashchan.xyz/server/ - multi-board board. Eclectic thing of beauty.
-/f/ - currently at https://trashchan.xyz/f/res/4.html#4 - doing flashtech old-school.
-/kind/ - currently at https://wapchan.org/kind - be excellent to each other.


Edited last time by Chobitsu on 04/03/2024 (Wed) 03:57:55.

Open file (8.45 MB 2000x2811 ClipboardImage.png)
Cognitive Architecture: Discussion Kiwi 08/22/2023 (Tue) 05:03:37 No.24783 [Reply] [Last]
Chii Cogito Ergo Chii
Chii thinks, therefore Chii is.

Cognitive architecture is the study of the building blocks which lead to cognition: the structures from which thought emerges. Let's start with the three main aspects of mind.

Sentience: The ability to experience sensations and feelings. Her sensors communicate states to her. She senses your hand holding hers and can react. Feelings are having emotions: her hand being held brings her happiness. This builds on her capacity for subjective experience, related to qualia.

Self-awareness: The capacity to differentiate the self from external actors and objects. When presented with a mirror, an echo, or other self-referential sensory input, she recognizes it as herself. She sees herself in the reflection of your eyes and recognizes that it is her, that she is being held by you.

Sapience: The perception of knowledge. Linking concepts and meanings; being able to discern correlations congruent with having wisdom. She sees you collapse into your chair. She infers your state of exhaustion and brings you something to drink.

These building blocks integrate and allow her to be. She doesn't just feel; she has qualia. She doesn't just see her reflection; she sees herself reflected, and acknowledges her own existence. She doesn't just find relevant data; she works with concepts and integrates her feelings and personality when forming a response. Cognition: subjective thought, reliant on a conscious separation of the self from external reality, that integrates knowledge of the latter. A state beyond current AI; a true intellect. This thread is dedicated to all the steps on the long journey towards a waifu that truly thinks and feels.


Edited last time by Chobitsu on 09/17/2023 (Sun) 20:43:41.
321 posts and 122 images omitted.
I found a paper that I believe shows a path to using LLMs as a shortcut to very quickly make a reasonably useful robowaifu. If what these guys say is true, I think it could be a big breakthrough. I looked through this whole thread and saw all these masses of lists and categorization, and it appears to me to be an endless task doomed to failure. It would take several lifetimes to make a dent in all this. It appears to me that forsaking LLMs and doing all this list work is just a complete recreation of the beginnings of AI research using the LISP computer language. I mean, it is exactly the same, and it got nowhere.

These guys have a paper on "control vectors". Two quotes:

"...Representation Engineering: A Top-Down Approach to AI Transparency. That paper looks at a few methods of doing what they call "Representation Engineering": calculating a "control vector" that can be read from or added to model activations during inference to interpret or control the model's behavior, without prompt engineering or finetuning..."

"...control vectors are… well… awesome for controlling models and getting them to do what you want..."

And a really important quote at the bottom of the paper:

"...What are these vectors really doing? An Honest mystery... Do these vectors really change the model's intentions? Do they just up-rank words related to the topic? Something something simulators? Lock your answers in before reading the next paragraph! OK, now that you're locked in, here's a weird example. When used with the prompt below, the honesty vector doesn't change the model's behavior—instead, it changes the model's judgment of someone else's behavior! This is the same honesty vector as before—generated by asking the model to act honest or untruthful!..."
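To make the mechanism concrete, here is a minimal sketch of the contrastive-activation idea the paper describes, assuming PyTorch and Hugging Face transformers; the model name, layer index, and strength constant are placeholder assumptions, and the real write-up builds the vector from many prompt pairs (with PCA), not one pair as here:

import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "gpt2"  # assumption: any small causal LM works for a demo
tok = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)
LAYER = 6  # assumption: middle layers tend to carry persona/style directions

def layer_mean(prompt):
    # Mean hidden state of one prompt at the chosen layer.
    ids = tok(prompt, return_tensors="pt")
    with torch.no_grad():
        hs = model(**ids, output_hidden_states=True).hidden_states[LAYER]
    return hs.mean(dim=1).squeeze(0)

# Contrastive pair: same content, opposite persona. The difference of the
# two activation means is the "control vector".
pos = layer_mean("You are an extremely honest assistant. The weather is")
neg = layer_mean("You are an extremely deceptive assistant. The weather is")
control = pos - neg

def add_vector(module, inputs, output):
    # Inject the vector into this layer's residual stream during inference.
    hidden = output[0]
    return (hidden + 4.0 * control,) + output[1:]  # 4.0 = strength knob, tune by eye

hook = model.transformer.h[LAYER].register_forward_hook(add_vector)
ids = tok("Tell me about your day.", return_tensors="pt")
print(tok.decode(model.generate(**ids, max_new_tokens=40)[0]))
hook.remove()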


cont...

I think this is actually how conscience works. I said this might be the case here >>24943. I said,
>"...I see intelligence, and I can presume to pontificate about it just as well as anyone because no one "really" knows. I see it as a bag of tricks. Mammals are born with a large stack of them built in..."

Look at animals: monkeys and giraffes come out of Mom and in 5 minutes are walking around. Same with all sorts of animals, including humans. Babies reach a certain age and they just start doing basically pre-programmed stuff. Terrible twos. Teenagers start rebelling. It's just the base level of the neural net. I think using LLMs as a template, we can do the same. Start with a decent one and then yes/no/stop/do this/do that, until it overlays a reasonable set of rules that we can live with. LLMs, as stated repeatedly, really are just a bag of tricks. But if the bag is big enough and has enough tricks in it... Look at the power of a top-end desktop: not human level yet, but it's getting there. And the bag of tricks for humans has been programmed over millions of years; for LLMs, a few years.

This path also, I think, will alleviate a huge fear of mine: no empathy. I think that by telling the waifu, when it does things wrong, to "be nice" (a key word), "think of others" (same), this will over time become a mass of control vectors that will spontaneously add up to empathy and care for others. Lots and lots of little nudges adding up to more than the sum of each.

Some people have portrayed my questioning about the safety of AI as doom and gloom, but it's not. It's the realization that without being programmed with the proper "bag of tricks" and the proper control vectors, we have something super smart that acts just like the psychopaths that are in fact running the West right now. I don't think any of us want something even smarter and more powerful doing that. A disaster even bigger than the one we have now.

I've also said much the same about motion and walking. Give it a rough approximation of "I'm here" and "I want to go there", give it vectors and a rough outline of what muscles to use to get the limbs from here to there, and use neural nets to slowly tweak this movement into something graceful. Here and elsewhere, >>22113


>>31242 >>31243
Very well written and thoughtful, thank you. It's awesome that I'm not the only one who found out about control vectors and thinks they are a huge deal. Like magic, after I mentioned them here >>31241, you come in with this! I'm so happy someone else is looking into this, because I feel I'm way over my head. I don't know where to even start, but this may be the breakthrough we needed to make LLMs a viable core.
>>31242 >>31241
>Control vectors look really powerful for controlling LLMs
I read that, but it didn't register until I read the paper I linked. It made a complicated idea much clearer, or so I thought. I didn't know what control vectors were before, but as soon as I read it, it really excited me.
>I feel I'm way over my head
I feel the same way. But it's not necessarily the big overall ideas in some of this stuff that are troublesome; it's the sheer minutiae of all these options and the pickiness of how to go about working with them. Up until recently it appeared to me that all this stuff was sort of hacked together and not really streamlined at all, but that's changing. Even though I said months ago I was going to get a 3D printer and start working on some of this and installing an AI, life is covering me up. I foresee wading through hours and hours and hours of work to get these things rolling. I have so much to do already; I bet I will have to delay even further. But it does give me time to think about it. I can surf a bit in the evenings and try to keep up with some of the ideas, but getting them to work, I know, is going to be a pain in the ass. It's all so new.

I do believe, though, that there is a path to making this work. I think I see it. Before, you had to have a stupidly expensive graphics card to do this. Then they made it so it runs on a CPU and in RAM (see the sketch below). Now almost all the motherboard makers are coming out with 128GB motherboards. This will be a big boon. You can have much bigger models and run them on AMD chips with graphics built into the processor. Some are really reasonable. I don't play games, so it's all I need for graphics. This combination will be much slower than the specialized graphics cards, but I bet compute per dollar will be far higher using commodity parts.
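For reference, a minimal sketch of that CPU-only local inference, assuming the llama-cpp-python bindings are installed; the model path is a placeholder, and any GGUF-quantized model works:

from llama_cpp import Llama

llm = Llama(
    model_path="./models/some-8b-model.Q4_K_M.gguf",  # placeholder path
    n_ctx=4096,    # context window, held in ordinary system RAM
    n_threads=8,   # CPU threads; scale with your core count
)
out = llm("Describe yourself in one sentence.", max_tokens=64)
print(out["choices"][0]["text"])

A 4-bit quant of a 7-8B model needs roughly 5-6 GB of RAM, which is why the 128GB boards mentioned above open the door to much larger models on commodity hardware.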


>>31245 >>31255
>control vectors
These affect the entire output in an unnatural way. For example, "Open door, happy vector" -> "Yes, haha, happy! I'm very happy! Come in, haha!" is something like what you'd get with a layer bias. I tried this with the brain-hacking chip:
>https://www.reddit.com/r/LocalLLaMA/comments/18vy9oc/brainhacking_chip_inject_negative_prompts/
It's better to just prompt an LLM with all the necessary information and generate the output like normal. However, this may be useful for the "orthogonal" model jailbreaks, which allow the LLM to respond accurately no matter what, and for another "mode" that turns on at "certain times". Orthogonal jailbreak:
>https://huggingface.co/hjhj3168/Llama-3-8b-Orthogonalized-exl2/
>list AI
What I proposed in >>31226 is, in simple terms, as follows: get input internal and external to the robot, process any thoughts or emotions by prompting an LLM, output speech or a desired action, and translate that into robot commands. Where Good Old-Fashioned AI (LISP) meets the deep-learning transformer model is a clever method of using Guidance to feed an LLM input and select the output in a predictable way. Doing it this way should compensate for both the lack of flexible situation processing that NLP has and the lack of reliability an LLM has. On top of this simple scheme of effectively using guided prompts to make a thinking machine, eventually adding situational learning using a memory knowledge graph would make it a passable, sentient robot. This is the simplest way I can see to program a conscious mind. I have some ideas on how the LLM could dynamically select NLP techniques or actions situationally, but I'm not there yet with a workflow or program.

The robot sensors and commands are best handled in ROS, on Linux. Robot inputs will communicate via ROS publisher/subscriber nodes with the decision-making LLM+NLP node (the workspace module). The entire thing will be coded in Python, on ROS, because these are the easiest tools to use for an application just like this. ROS runs C++ too, for cases where that makes more sense.
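In that spirit, here's a bare-bones sketch of such a workspace module as a ROS 2 node, assuming rclpy; the topic names, the action list, and the llm() stub are placeholder assumptions, not a finished design:

import rclpy
from rclpy.node import Node
from std_msgs.msg import String

ACTIONS = ["greet", "fetch_drink", "wait"]  # closed set keeps the output predictable

class WorkspaceModule(Node):
    def __init__(self):
        super().__init__("workspace_module")
        self.sub = self.create_subscription(String, "sensor_events", self.on_event, 10)
        self.pub = self.create_publisher(String, "robot_actions", 10)

    def llm(self, prompt):
        # Placeholder: swap in llama-cpp-python, Guidance, etc. here.
        return "fetch_drink"

    def on_event(self, msg):
        # Guided prompt: the LLM is only allowed to pick from a fixed menu,
        # which is the "predictable output" part of the scheme above.
        prompt = (f"Observation: {msg.data}\n"
                  f"Pick exactly one action from {ACTIONS}:")
        choice = self.llm(prompt).strip()
        if choice in ACTIONS:  # reject anything outside the allowed set
            self.pub.publish(String(data=choice))

def main():
    rclpy.init()
    rclpy.spin(WorkspaceModule())

if __name__ == "__main__":
    main()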

Open file (46.39 KB 458x620 eve preview.jpg)
My Advanced Realistic Humanoid Robot Project - Eve Artbyrobot 04/18/2024 (Thu) 17:44:09 No.30954 [Reply]
So far I have plans to build Adam, Eve, and Abel robots. All of these are Bible characters. This thread will cover the Eve robot.

Eve will have no "love holes" because adding those would be sinful and evil. It is a robot, not a biological woman, after all, and I will view her with all purity of heart and mind instead of using her to fulfill the lusts of my body. Instead I will walk by the Spirit, no longer fulfilling the lusts of the flesh, as the Bible commands. Eve will be beautiful because making her beautiful is not a sinful thing to do. However, I will dress her modestly, as God commands of all women everywhere. This would obviously include robot women, because otherwise the robot woman would be a stumbling block to men which could cause them to lust after her, which would be a sin. To tempt someone to sin is not loving and is evil, and so my robot will not do this. To dress her in a miniskirt, for example, would be sinful and evil, and all people who engage in sinfulness knowingly are presently on their way to hell. I don't wish this for anyone. My robot will dress in a way that is a good example to all women, and is aimed toward not causing anybody to lust as a goal.

My robot will have a human bone structure. It will use either a PVC medical skeleton or fiberglass fabricated hollow bones. My robot will look realistic and move realistically. It will be able to talk, walk, run, do chores, play sports, dance, rock climb, and do gymnastics. It will also be able to build more robots just like itself and manufacture other products and inventions. I realized that with just a head and arm, a robot can build the rest of its own body, so that is my intention.

My robot will use BLDC motors for drones, RC, and scooters that are high speed and low-ish torque, but I will downgear those motors with an Archimedes pulley system custom made from custom-fabricated, bearings-based pulleys. By downgearing with pulleys instead of gears, I will cut down the noise the robot makes, so it will be as silent as possible for indoor use. By downgearing, I convert the high-speed motors into moderate speeds with great torque. BLDC motors with large torque generally are too large in diameter for a human form factor and take up too much volumetric area to be useful, which is why I go with the high-speed, smaller-diameter type of motors but just heavily downgear them 32:1 and 64:1.

My robot will have realistic silicone skin. Thom Floutz, the LA-based painter, sculptor, and make-up artist, is my inspiration as it pertains to realistic skin. The skin for my robots has to be at his level to be acceptable. It must be nearly impossible to tell the robot is not human to be acceptable. I will have a wireframe mesh exoskeleton that simulates the volumes and movements of muscle underneath the skin, which will give the skin its volumetric form like muscles do. Within these hollow wireframe mesh frameworks will be all the electronics and their cooling systems.

All of my motor controllers will be custom made, since I need them VERY small to fit into the confined spaces I have to work with. I need LOADS of motors to replace every pertinent muscle of the human body, in such a way that the robot can move in all the ways humans move and have a near-human level of strength and speed.
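As a sanity check on the downgearing trade-off described above, a back-of-envelope calculation with assumed figures (not Artbyrobot's actual motor specs):

motor_rpm = 10000          # assumed high-speed BLDC output
motor_torque_nm = 0.05     # assumed raw motor torque
ratio = 64                 # the 64:1 pulley reduction described above
efficiency = 0.85          # assumed loss across the pulley stages

out_rpm = motor_rpm / ratio                         # ~156 rpm at the joint
out_torque = motor_torque_nm * ratio * efficiency   # ~2.7 N*m
print(out_rpm, "rpm,", out_torque, "N*m")

The point of the trade: speed divides by the ratio, torque multiplies by it (minus friction losses), which is how a small high-speed motor stands in for a large high-torque one.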


41 posts and 26 images omitted.
My concern with implementing "emotions" in my AI is that I don't want to promote the idea that robots can ACTUALLY have emotions, because I don't believe that is possible, nor ever will be. They don't have a spirit or soul and never will, nor could they. They are not eternal beings like humans. They don't have a ghost that leaves their body and can operate after the body dies, like humans. The ghost is what has emotions. A machine can't. And yet people already believe even the most primitive AI has emotions, and they are delusional on this point, or ill informed. So I am campaigning against that belief, which is becoming all too popular.

That said, I think robots are simply more interesting and fun when they pretend to have emotions and act accordingly, as more accurate simulations or emulations of human life. This makes them all the more intriguing. It's like a sociopath who just logically concludes what emotion they ought to be feeling at a given point in time and pretends to feel that emotion to fit in with society, even though they feel nothing in that moment.

Now, one could argue that allowing your robot to claim to feel anything is lying and therefore immoral. I think it's not lying as long as the robot openly explains that it is only pretending to have emotions as part of its emulation of humans in its behaviors and looks, but does not feel anything ever, nor can it, nor can any robot ever feel a thing, EVER. Then it is admitting the truth of things while still opting to play-act being like a human in this regard.

It would not be an issue at all if everyone were sound-minded and informed on this topic. But the more people I come across who think AI (even pathetic, clearly poorly implemented, primitive AI) is sentient ALREADY, can feel real emotions, and deserves human rights as a living being, the more I see this delusion spreading, and the more I want to just remove all mention of emotion from my robot, so as not to spread this harmful deception going around, which disgusts me. However, that would make my robot dull and less relatable and interesting. So I feel the compromise is for the robot to clearly confess that it's just pretending to have emotions and explain how that works: it's just a variable it sets based on circumstances that would make a human feel some emotion; it sets its emotion variable to match and acts accordingly, altering its behavior some based on this emotion variable; and it feels nothing, all of this being logically set up as an emulator of humans. As long as it gives that disclaimer early and often with people, then I'm not spreading the lie that robot emotions are real emotions, and the robot can campaign actively against that delusion. A minimal sketch of that variable scheme follows.
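For concreteness, a tiny sketch of the emotion-variable scheme described above; the event names and the mapping are invented for illustration:

# The robot never feels anything: an event just sets a label that
# modulates behavior, and the disclaimer states exactly that.
RULES = {"user_smiles": "happy", "task_failed": "frustrated"}  # assumed mapping

class EmotionEmulator:
    def __init__(self):
        self.emotion = "neutral"

    def update(self, event):
        # Set the emotion variable to what a human would plausibly feel.
        self.emotion = RULES.get(event, self.emotion)

    def disclaimer(self):
        return ("I simulate emotions as a behavioral variable; "
                "I do not and cannot actually feel anything.")

emu = EmotionEmulator()
emu.update("user_smiles")
print(emu.emotion, "-", emu.disclaimer())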
>>31181 but when i smile at my mirror it feels happy because it smiles back
>>31182 Agreed. We often imagine inanimate things feeling stuff, and that can be fun and whatnot, but I just think it's important to admit when it is just an imaginative fiction and not real, so as to stay grounded in reality mentally and not drift into delusion.
Here is an updated drawing of the design for the 64:1 downgearing pulley system for the index finger actuation of the distal 2 joints of the finger. On the bottom right is a zoomed-in view of the lower set of pulleys and their routing. I have now built the bottom-most 3 pulleys in the zoomed-in portion, and photos of them are also attached.
As I'm now 90% through making my first 64:1 downgearing Archimedes pulley system and testing and debugging it, I now have more precise measurements for the pulley system's total size. I updated its size in my main CAD model for the robot, and it was a good 18% increase compared to my initial estimates. I realized I need to figure out how to fit all my pulley systems for the hands properly, for every muscle of the hands/wrist, into my main CAD model, especially since the pulley systems are taking more space than planned.

Turns out I needed a bit over 40 pulley downgearing systems for the hands and wrists zone, and due to their larger size, I could not fit these into the forearms along with the motors I had planned to place there. So instead of moving the pulley systems into the upper arm or torso, I realized the pulleys would be best placed in line with the motors and with what the motors are actuating (the hands/wrist). So it was the motors in the forearms that had to go elsewhere. I placed all of them into the torso, mostly in the lats area, with some in the upper back tenderloin area too. So some finger motors are in the upper back, and their cable routing has to go through the whole arm, be downgeared in the forearm, then make its way to the fingers. That's a long trip, but unavoidable IMO with my design constraints. I don't think this long travel distance is a big issue, since the pre-downgeared cable running from the motors into the arm is high speed and low torque, so it won't have much friction while making turns in the TPE teflon tubing, as it isn't pulling hard yet. So these turns, as the cable travels through the shoulder and elbow tubing, won't be too bad friction-wise.

There are also some nice upsides to moving the motors from the forearms into the torso. One upside is that the wire routing for powering the motors is now a shorter distance from the batteries in the midsection. That cuts down on power wasted as heat in the wires; this high-amp wiring is ideally kept as short as possible due to the resistance of the wire and the heat that causes (a rough calculation follows this post). Another upside is that the thrown weight is decreased by a lot when the motors are not in the forearms, which enables the hand/lower arm to move more effortlessly and faster as a result. This also reduces the moment of inertia (definition: the moment of inertia is a measure of how resistant an object is to changes in its rotational motion). This means it will be able to change directions faster, which will improve its reflexes, for example.

Now, it is a bit scary for me to be moving more components into the torso, taking away room for things I may want to add to the torso in the future, leading us ever closer to the dreaded running out of room. However, we still have room for future changes, and we solved the need for space for the hand gearing perfectly. And with the above-mentioned upsides, this was a great change. Here's the updated CAD for the forearms. Note: the teal boxes represent an Archimedes pulley system where 64:1 downgearing is to take place.
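A rough check of the wire-heating point above, with assumed (not measured) currents and run lengths; the resistance is a standard figure for 12 AWG copper, and the loss follows P = I^2 * R:

i_amps = 10.0            # assumed motor current draw
ohms_per_m = 0.0052      # ~12 AWG copper, one conductor
for run_m in (1.0, 0.4): # forearm-mounted vs torso-mounted run (assumed lengths)
    r = 2 * run_m * ohms_per_m   # out-and-back pair of conductors
    watts = i_amps**2 * r        # P = I^2 * R
    print(run_m, "m run ->", round(watts, 2), "W lost as heat")

Shortening the run cuts the loss proportionally, which is the author's point about keeping high-amp wiring near the batteries.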

Open file (1.76 MB 2560x1600 wp8232537.png)
Open file (427.25 KB 1920x1200 wp3421764.jpg)
Open file (368.53 KB 1680x1050 bRSM5J.jpg)
/robowaifu/meta-9: Wintertime will be sublime. Chobitsu Board owner 10/30/2023 (Mon) 00:42:15 No.26137 [Reply] [Last]
/meta, offtopic, & QTDDTOT
>---
General /robowaifu/ team survey (please reply ITT) (>>15486)
>---
Mini-FAQ
>A few hand-picked posts on various /robowaifu/-related topics
-Why is keeping mass (weight) low so important? (>>4313)
-How to get started with AI/ML for beginners (>>18306)
-"The Big 4" things we need to solve here (>>15182)
-HOW TO SOLVE IT (>>4143)
-Why we exist on an imageboard, and not some other forum platform (>>15638, >>17937)
-This is madness! You can't possibly succeed, so why even bother? (>>20208, >>23969)
-All AI programming is done in Python. So why are you using C & C++ here? (>>21057, >>21091, >>27167, >>29994)


Edited last time by Chobitsu on 02/29/2024 (Thu) 06:43:57.
483 posts and 158 images omitted.
>>31138 >>31151 >>31158
Makes perfect sense; I am not really pushing for it either, I was just wondering and also making sure it was not due to lack of awareness. I especially agree with the problem of chats turning into information black holes. If people want to socialize, I imagine there is no problem with using an off-topic thread for that.
>>31237
I think that is pure bullshit. How many tokens per second do you need? With a 7-8B 4-bit quant you could fit it on an SBC/mini PC. I purchased a tiny N100-based PC from Amazon for like $160. Its size is 2.83 x 2.83 x 1.77 inches (search "M6S Mini PC"). So far I think it's perfect for using in projects (also, it's not ARM, so big bonus). Even if onboard compute is not practical, there is no good excuse for not allowing your own server. I think the advice of >>31239 is good; GPT4All is a good place to look, and I also like llama.cpp.

While I am here, I would like to share some stuff I found that I think could be of use to others.
https://youtu.be/TQiLLcumqDw | Rolling joints that emulate how natural joints work.
https://vgel.me/posts/representation-engineering/ | Control vectors look really powerful for controlling LLMs. I am thinking about how to apply control vectors to LLMs in the context of a greater cognitive architecture.
>>31237
It's basically a fleshlight, but I support it. He follows me on Twitter and joined our project server (WaifuLabs, for lack of a better name right now). His goals are the same, but he's more focused on disrupting the simp market by providing a sexual outlet. It's not, and was never intended to be, a robowaifu of any sort.
>>31158 We've had a few meetings in VRChat. That's the closest thing I've been able to come up with to a VR meeting space. I considered other apps, but everyone already has or is familiar with VRC. This actually has helped greatly b/c you're speaking in real time, and can gesture, sketch drawings, even play videos from YouTube. VC meetings (Discord) were productive as well. There are things you just can't do on an imageboard if you want to collaborate.
>>31237
>not open source
>not run locally
Massive red flags right there.
>>31239
I know LLMs can be run on something as small as a Raspberry Pi (https://youtu.be/Y2ldwg8xsgE). But even setting aside the insufferable guardrails of ChatGPT or Bing's Copilot turning every reply they give me into shit, or me stupidly trying to figure out how to use text-to-speech and voice-to-text so I don't have to spend any more time typing and staring at a screen, I still haven't used one that makes for a remotely good chatbot. Sure, I'd like for it to read through gigabytes of documents and tell me what's in them, help me write code, or even give me reminders so I don't have to check my phone, but aside from character.ai (which is still bad because of guardrails), I haven't yet seen one that actually functions like a halfway decent chatbot.
>>31241
>Even if onboard compute is not practical, there is no good excuse for not allowing your own server.
He hasn't said one way or another whether you could host your own server; he's just given no information about doing so, which raises red flags.
>>31248
>His goals are the same but he's more focused on disrupting the simp market by providing a sexual outlet.
>It's not and was never intended to be a robowaifu of any sort
What exactly makes it *not* a robowaifu? I'm reminded of the Minimum Viable Waifu thread >>13648, and aside from lacking a display with an avatar on it, it seems to meet the most minimum requirements without a full body and motorized onahole.
>>31251
Yeah, that's the main reason I won't be getting one.

Emmy The Robot Robowaifu Technician 04/15/2024 (Mon) 20:31:05 No.30919 [Reply] [Last]
Welcome, all Nandroid fans, to the Emmy thread, for discussing and posting about EtR. Please refrain from posting off-topic things.
---
Also, be sure to check out Emmy-Pilled's project thread! (>>25306)
Important Community Links:
Boorus, etc.:
https://nandroid.booru.org/index.php
https://emmytherobot.art/ (Jumbo controlled, be careful.)
Google Docs:
https://docs.google.com/spreadsheets/d/1mXuNh9ESedCiDZclVuz9uiL7nTNk3U9SgCE_CRHi3Us/htmlview#
Webtoons:
https://m.webtoons.com/en/canvas/emmy-the-robot/list?title_no=402201
>previous threads:
>>27481
>>26629
Edited last time by Kiwi_ on 04/17/2024 (Wed) 18:33:09.
121 posts and 58 images omitted.
Belly dancers are on my brain this morning. Any nans you'd like to see drawn like that? Here's what's been done so far.
>>31232 Horrormode Lulu would do amazing things as a belly dancer
>>31232 Denise?

SPUD (Specially Programmed UwU Droid) Mechnomancer 11/10/2023 (Fri) 23:18:11 No.26306 [Reply] [Last]
Henlo anons,

Stumbled here via a YouTube rabbit hole and thought I'd share my little side project. It started out as just an elaborate way to do some mech R&D (making a system to generate animation files in Blender on Windows and export/transfer them to a Raspberry Pi system), and I found tinkering with the various Python libraries a kinda neat way to pass the time when weather doesn't permit my outside mech work. Plus I'd end up with a booth babe that I don't have to pay or worry about running off with a convention attendee. Currently running voice commands via Google speech and ChatGPT integration, but I'm looking into offline/local stuff like OpenChat.

WEF and such are so desperate to make a totalitarian cyberpunk dystopia, I might as well make the fun bits to go along with it. And yes, chicks do dig giant robots.
308 posts and 166 images omitted.
Open file (5.34 MB 320x570 Basic Gyro Test.mp4)
Turns out the neck servos have the ability to rotate continuously, and will choose the shortest distance to rotate rather than retaining their absolute position. So if one is moving through 180 degrees, there is a 50/50 chance of it moving clockwise or counterclockwise. Solution? Interpolate values (see the sketch below). Even then it will still occasionally freak out, so I'll probably end up replacing them with ones that... don't. In the meantime, I got a basic gyroscope program into the same script as the face code and AI image recognition. I might split the servos/balancing into a sub-program, as each iteration takes upwards of 300ms, and the balancing code optimized for the long iterations (in the video) isn't as good as the independent program. You can also see the neck slop from the ball joints; I need to put a small elastic in the neck to hold it on the screw thread inside the pistons.
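A minimal sketch of that interpolation fix, stepping the commanded angle a few degrees per tick so the servo never gets to pick a rotation direction on its own; send_to_servo() is a hypothetical stand-in for the servo-board call:

def step_toward(current_deg, target_deg, max_step=5.0):
    # Move at most max_step degrees per tick toward the target.
    delta = target_deg - current_deg
    if abs(delta) <= max_step:
        return target_deg
    return current_deg + (max_step if delta > 0 else -max_step)

angle = 0.0
while abs(angle - 180.0) > 0.01:   # sweep to 180 degrees in controlled increments
    angle = step_toward(angle, 180.0)
    # send_to_servo(angle)  # hypothetical servo-board call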
Open file (4.77 MB 320x570 Improved Gyro Test.mp4)
>>31193 Now have improved balancing code running at the same time the main program runs (including the AI image recognition for the webcam), and added a feature to prevent the rapid transition between the two. Also, the servoboards & gyro would immediately stop the program if an ant tripped over the cables and they disconnected for a second, so I added a little feature that just has the program wait for the signal to return. For larger tilt values I'll have to figure out a solution to keep it from overshooting.
Open file (404.32 KB 1080x447 pelvis.png)
Open file (1.53 MB 1272x1216 spine.png)
Printed a new housing for the pelvis/abdominal servo mounts so it is less wobbly, and started work on back panels: some paper templates for scapula panels, and a print-in-place articulated spine that serves no other function than to look nice.
Amazing progress! Do you always prototype with paper?
>>31235 Sometimes I use paper; it beats the heck outta waiting for a print only to find out it is the wrong size. Speaking of which, I used some paper to make a little guy I call "Sakana Kuchibiru-san" (Mr. Fish Lips), based on a mechanism someone posted elsewhere. Replace the paper with stretchy rubber and a few servos to pull it into a smile/frown, and I've got a good mechanism.

Robot skin? Possible sensitivity? Robowaifu Technician 09/15/2019 (Sun) 07:38:17 No.242 [Reply] [Last]
The Anki VECTOR has a skin-like touch sensor on it; could we incorporate it into our robogirls?
80 posts and 17 images omitted.
This sounds not too bad cost-wise, but it would only help in doing hands for touch, and ideally you want full-body sensors.
https://www.dailystar.co.uk/news/latest-news/soft-robot-hand-human-touch-17081346
https://doi.org/10.1002/admt.202200595
>>31029 Very interesting. The pressure bandwidth is especially impressive.
I think this might be of interest. It's a way to cut slits in flat materials so that, when pulled, they automatically form rounded shapes of all sorts. There's a video of them using copper and aluminium to form faces and all sorts of stuff. Some are compliant elastomer. I'm thinking this "might" come in handy to make a pressure-sensitive skin backing, or more rugged forms for lighter skin.
https://www.youtube.com/watch?v=vrOjy-v5JgQ
https://www.youtube.com/watch?v=kFkD45NUIzQ
I think it's a difficult problem to provide touch sensors all over. It might be easier to form them on a stiffer board material, then cut slits in it to have it conform to the body. This would then be covered with another skin material, separating the task into units.
>>31040 I was looking at the papers it referenced, and this is a good one with a bit of an overview:
A Mini Review of Recent Advances in Optical Pressure Sensor
http://jsstec.org/_PR/view/?aidx=35687&bidx=3202
It has a retarded viewer, but it can be saved as a PDF.
Open file (280.91 KB 1250x908 strain sensor.jpg)
Cross-Posting to this thread: >>31224

Waifu Materials Robowaifu Technician 09/12/2019 (Thu) 03:04:33 No.154 [Reply] [Last]
I would define a robowaifu as a doll with robotic features. However, there are many different types of dolls (BJD, cloth doll, sex doll, etc). A doll has a skin or surface material, sometimes a filler (cotton), and sometimes an internal structure (bones and joints).

Continuing the discussion from (((>>2831 >>2836 todo:relink))), I want to create a thread to explore the many possible surface materials for a waifu (robo or no). The most important decision is whether to use a hard or soft material.

Hard Materials
>Ceramics
>Wood
>3D Printed Hard Plastic (PLA/ABS)
>Injection Molded Hard Plastic

Soft Materials
>Natural Fabrics (Cotton, Silk, Wool)
>Synthetic Fabrics (Vinyl, Polyester, Nylon)
>Fur/Hair (presumably synthetic, inb4 yiff in hell)
>Silicone or TPE Rubber (TPE is basically a cheaper alternative to silicone)

I'm strongly biased against the hard materials for comfort reasons. Personally, I have a hard time seeing myself falling in love with something hard, but others on this board talk about using hard materials, so I'm trying to keep an open mind.

My preference is for silicone, but there are four big problems with it. Firstly, it's expensive. Secondly, it impedes modification after the silicone has set. Thirdly, it contributes to the uncanny valley/silicone slut issue. Fourthly, it is heavy, and this weight really constrains the skeleton, posability, and probably robotics. Because of the weight, silicone dolls have heavy-duty skeletons.

My second choice is therefore fabric, presumably stuffed with cotton. Fabric is super comfy and has no uncanny valley issue. A non-fuggable fabric doll or robot would have no stigma issue, and could be the start of a productive hobbyist scene with plenty of females. Fabric is extremely lightweight, which could be a plus or a minus. By itself, it's insubstantial and not ideal for robotics. A fabric robot is possible, but it requires hard, heavy parts underneath to provide structure and act as actuators, which would make it less comfy. The fabric could be a textile (cotton), a synthetic leather (resembles skin, makeup/dress-up potential), or synthetic fur for you furfags out there.

Another possibility is a hard vinyl BJD-like doll with a layer of something comfy on top. Alternately, you all can reject my comfort autism if the benefits of having a hard doll/robot are clear enough. I'd like to hear others make the case for a hard doll/robot, since I don't think I could do the argument justice.

Finally, this is a discussion, not a debate. There are multiple paths we could take, and I'm sure different robowaifuists will try different techniques to see what works and what doesn't. I'm more interested in seeing what options are on the table than shutting down any particular approach.
290 posts and 57 images omitted.
Using Shapelock to make motor connectors, knobs and couplings https://www.robotroom.com/Prototype-Plastic-2.html
Thanks for all the great inputs ITT, Grommet! Cheers. :^)
I found a good page. It has a lot of advice on casting various high-precision gears and parts. He uses small CNC machines, makes molds with silicone, and then casts plastic parts. It has specifics, which is good, and covers what types of materials he uses. His focus is on robots.
Guerrilla guide to CNC and resin casting
https://lcamtuf.coredump.cx/gcnc/full/
I wonder if you could pour the before-mentioned Shapelock into these molds. I'm not sure the Shapelock plastic liquefies enough to do so. Possibly you could put this in an oven with a sprue filled with Shapelock, melt it down to run into the mold, then when you pull it out apply a small vacuum for a bit. Likely that would get all the entrained air out.
More materials links. This guy makes bedsheets into waterproof and sturdy tarps with silicone. Could be good for the silicone fetish folk.
Recycled Bedsheets Make The Best Waterproof Tarps
https://www.youtube.com/watch?v=z_R0gEDZhAI
He got the silicone treatment from a guy who does all sorts of material, casting, and glue-type experiments. He has some interesting stuff. Here's a video of him making low-cost pourable silicone, for making things or molds, that dries in an hour or so.
DIY POURABLE SILICONE for mold making. Thin silicone w/Naphtha, fast cure w/cornstarch.
https://www.youtube.com/watch?v=E_IOqxds130
There is a problem with this: they are banning naphtha in some places. Likely too damn useful for the powers that be, so he has a video with alternatives.
Thinning Silicone: NAPHTHA is being BANNED. Now what? What can you use instead? SURPRISING RESULTS!!


Open file (72.69 KB 1024x682 galinistan.jpg)
Open file (280.91 KB 1250x908 strain sensor.jpg)
>>6943 This week I'll be making some galinstan. It's not a country; it's a eutectic alloy of 3 low-melting-point metals. It's completely safe and non-toxic, and one order of magnitude less conductive than copper. It's going to be injected into silicone channels using the zero-volume air chamber method (http://www.kevincgalloway.com/portfolio/zero-volume-air-chambers/) and used with a Teensy LC in a touch-capacitance circuit. This liquid metal can deform with the silicone, and has all the benefits of being a conductive material for sensing deformation (change in resistance) and body proximity (change in capacitance). This type of sensor is very common in the literature, and I have high expectations for it in waifubotics.
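A quick worked check of why a stretched liquid-metal channel senses deformation, with assumed dimensions; the resistivity is an approximate published figure for galinstan:

rho = 2.9e-7   # ohm*m, approximate galinstan resistivity
V = 1e-9       # m^3, fixed channel volume (assumed)
for stretch in (1.0, 1.2, 1.5):     # 0%, 20%, 50% elongation
    L = 0.05 * stretch              # 5 cm channel at rest (assumed)
    R = rho * L / (V / L)           # constant volume: A = V/L, so R = rho*L^2/V
    print(f"{stretch:.1f}x stretch -> {R:.3f} ohm")

Because the channel's volume is fixed, stretching lengthens it and thins it at the same time, so resistance grows with the square of the elongation: an easily measurable signal.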

Open file (329.39 KB 850x1148 Karasuba.jpg)
Open file (423.49 KB 796x706 YT_AI_news_01.png)
Open file (376.75 KB 804x702 YT_AI_news_02.png)
General Robotics/A.I./Software News, Commentary, + /pol/ Funposting Zone #4 NoidoDev ##eCt7e4 07/19/2023 (Wed) 23:21:28 No.24081 [Reply] [Last]
Anything in general related to the Robotics or A.I. industries, and any social or economic issues surrounding them (especially those surrounding robowaifus).
>previous threads:
> #1 (>>404)
> #2 (>>16732)
> #3 (>>21140)
391 posts and 167 images omitted.
>>30976
>It's just usual content farming all big YouTubers do.
I believe that's just called 'clickbait', isn't it Anon? :^)
>I have never in the wild seen anyone care beyond just feeling sorry someone feels that lonely.
Then I think it likely you haven't broached this topic clearly with any women who consider themselves still to have SMV (today that's probably even up to 50yo+ grannies, lol), or with a hard-core Leftist/Filthy-Commie. They -- all of them -- hate the very idea itself. Most of the ones I've engaged with in any way also threaten physical violence against robowaifus if/when they ever see one. We'll see how that all works out for them. :^)

The Filthy Commies go one step further and threaten physical attack against robowaifu owners too, since that's how Filthy Commies behave, after all (think: Pantyfags, F*men, etc.) -- under the bribery directives of their Globohomo puppetmasters, ofc. LOL. Heh, we all need to look up Stickman (is he out yet?) and give him a complimentary Model A robowaifu! :^)

Blacks just destroy things simply b/c they're blacks, by all appearances. They also will be involved with this violence to be directed against robowaifus/owners; but mindlessly, not for the agenda-driven motives of the first two groups mentioned here. Once they see the GH media glorifying physical attacks against robowaifus, I'm sure they'll be all-in with it for a while, too. (And -- like these other Leftists -- they too have their paid rabble-rousers [cf. the paper-hangin', pregnant-woman-abusin', multi-felon Fentanyl Floyd's overdose-death's -- mostly-peaceful, mind you -- Burn Loot Murder 'honorarium' """protests""", et al].)

All this type of clickbait (cf. >>30975, et al) is literally just GH predictive-programming attempting to prepare the masses for violence, come the day. TOP KEK! May the stones that they are preparing to roll down on us all roll back upon their own heads instead, in Jesus' name!! [1] :DD

Make no mistake: this is a broad cultural war already going on within our so-called """society""" of today. Robowaifus will amp that up to 12. I'm sure /cow/ and their ilk will be delighted, once the time is ripe. So get your popcorn ready, kids! :D
>t. Noooticer. :^)


Edited last time by Chobitsu on 04/21/2024 (Sun) 15:03:44.
>>30986 This is completely and abundantly true. The programality (program-reality; is this even a word? if not, it should be) of it all is baked in. Like those fools that buy pit bulls and tell everyone it's all in how you raise them. Of course the nice doggies rip the skin off their children's heads.
>>30989 All pit bulls should be destroyed, outright. I love doggos in general (I've had several), but not those demonic little sh*tes. In fact, in some bizarre spiritual sense, they seem almost allegorical in their natures (for demons, ofc) to me.
>>30991 Based
