/robowaifu/ - DIY Robot Wives

Advancing robotics to a point where anime catgrill meidos in tiny miniskirts are a reality.

Build Back Better

More updates on the way. -r


Homemade doll waifu thread Robowaifu Technician 09/18/2019 (Wed) 03:58:38 No.372 [Reply]
I just found this board a few days ago but I've been planning for about a year to make a homemade sex doll, like the kind that will imminently be made illegal here in the US. I want to detail my plan. I know this board has an anti-3DPD/anti-sex-doll bias, but hear me out.

Like many on this board, I initially had the ambition to make a sophisticated robot, but realized I should start small and reproduce something others have already created. Since my preferred, ahem, flatness of doll is soon to be illegal, it would be nice to have a homemade alternative.

Smooth-On Inc sells a brand of mold-and-set silicone, available on Amazon and elsewhere, that is easy to use and designed for human prosthesis. I intend to use Ecoflex 00-30 with Silc-Pig for pigment (and freckles). I experimented with creating a homemade onahole using Ecoflex with limited success.

For the mold, I intend to build it in segments using a 3D printer. I've been practicing modeling girls in Blender. What I have so far is acceptable for my tastes, although I'm no expert at 3D modelling or the human form. I also designed 2 mold halves for a Tinkerbell-sized prototype. The mold halves have holes for screws (to press them together) and a funnel at the top. I got as far as exporting to STL to observe how the slicer handles it. Once I get a new 3D printer, I'll print these mold halves and pour some Ecoflex into them. I may also put in some steel wire as a makeshift skeleton.
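Before pouring, it's worth scripting a rough material estimate. A minimal sketch, assuming Ecoflex 00-30's listed specific gravity of roughly 1.07 g/cm^3 and a 1:1 mix by volume (check the current Smooth-On datasheet before trusting either number); the 250 cm^3 cavity is purely a placeholder for the prototype:

```python
# Rough pour estimate for a silicone casting, as a sanity check before buying
# material. Assumes Ecoflex 00-30 cures at roughly 1.07 g/cm^3 (verify against
# the current Smooth-On datasheet) and that parts A and B mix 1:1 by volume.

def pour_estimate(cavity_volume_cm3, density_g_cm3=1.07, waste_factor=1.15):
    """Return (total mass in grams, per-part volume in cm^3) for a mold cavity,
    padded by a waste factor for mixing losses and the pour funnel."""
    volume = cavity_volume_cm3 * waste_factor
    mass_g = volume * density_g_cm3
    per_part_cm3 = volume / 2  # 1:1 mix ratio by volume
    return mass_g, per_part_cm3

# Example: a Tinkerbell-sized prototype with a hypothetical ~250 cm^3 cavity.
mass, per_part = pour_estimate(250)
print(f"mix ~{mass:.0f} g total, ~{per_part:.0f} cm^3 of each part")
```

Scaling the same arithmetic up to a full-sized mold segment is just a matter of swapping in the slicer's reported cavity volume.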

I had a crap 3D printer that I threw out. I'm currently trying to move to a new city and therefore won't be able to buy a new printer until I get settled. Not having a printer is the number one impediment to this project right now. I would love to hear opinions on what 3D printer model is ideal. I'd be willing to spend upwards of $1k. I've been thinking of buying used to get a better deal. I'd like a large build plate to minimize the number of parts for the full sized mold.

One thing I'm not sure of is how to make the skeleton. I will experiment with 3D printing the skeleton, but I don't really know how to make linkages. Nevertheless I have started designing them using SCAD. I also wonder how the pros make the skeleton sit in the right place within the mold, without any part of the skeleton touching the mold. Would love to hear your thoughts on this.

Usually doll heads are separate from the body (screwed on at the neck) and the body is completely one piece. This is my plan, although removable arms and legs would be a possibility. I'm hoping to make a removable vagina as well, for ease of cleaning.
5 posts and 4 images omitted.
>>18710 >Also, for some guys simple dolls with some AI and maybe an animated head would be at least a good start. I think you're right Anon.
>>18710 Also, I just wanted to double-check and make sure you were aware that we have some wonderful on-topic neighbors on the webring, Anon. /doll/ https://anon.cafe/doll/res/191.html
>>18722 Yeah, I know, but thanks. I think they were inactive a while ago. What I'd like is that they would keep us in the loop and make backups of what's available. But I didn't look yet, tbh.
>>18731 >Yeah, I know, but thanks. I think they were inactive a while ago. What I'd like is that they would keep us in the loop and make backups of what's available. But I didn't look yet, tbh. A new BO took over a little while back, and he's been injecting life into the board. We keep backups of /doll/ and dozens of other IBs. /robowaifu/ gets the most attention ofc, but /doll/ is kept pretty current.
>>18710
> ball jointed doll BJD 60cm by SorrowBJD
Forgot the link to the last one, or thought it was already in the Thingiverse group I mentioned above. Anyways, here it is: https://www.thingiverse.com/thing:2941323/ - also look into the remix section. If the "non-commercial use allowed" licenses bother you, make your own design. We might not need very much work to make the parts switchable. My goal is to keep everything as flexible as possible. The mentioned and linked doll design above is used here >>18780 and in following posts.

Prototypes and failures Robowaifu Technician 09/18/2019 (Wed) 11:51:30 No.418 [Reply] [Last]
Post your prototypes and failures.

I'll start with a failed mechanism; may we build upon each other's ideas.
350 posts and 305 images omitted.
>>18770 >But just looking at this yesterday I realized we could probably also use this for faces. That sounds really encouraging Anon. Good work. >>18783 That's coming together. My apologies if I missed it earlier, but is this a new face?
>>18783
Working with body parts is really more fun than figuring out the details of some mechanism. I worked a bit on understanding the body size relations. I think it's a good idea in some cases to scale something down until it looks weird and then go back a bit above that border, especially for parts which are supposed to look cute and maybe a bit fragile. Her arm got way thinner than it started; as long as she doesn't look bulimic it's fine. For the thighs in a later iteration, I will still go with the grown-up version of Alita's size, though. I added therobotstudio's dexarm as a comparison in the second picrel. I really thought they would look good when I was watching his videos, but compared to something more human-like and petite it's over. He simply has a different use case. We have to go for the cute feminine looks first and then decide what we can do within those constraints.
>>18784 Thank you. Anyways, the parts are not from me, they're from that BJDoll I thought that I linked here: >>18710 (but didn't). From the last picrel there, here's the link: https://www.thingiverse.com/thing:2941323 Of course this comes with the downside that these parts might not come with the license you like to have ("created by Thingiverse user SorrowBJD, and is licensed under cc"). But aside from me being still undecided on this, I'm into working out how to have a framework where we can switch out parts anyways. My whole idea about using a cylinder to define the workspace here >>18782 was meant that way.
>>18790 OP if you're here, or if some Anon would care to bake, we need a new thread! :^)

Robowaifus in media Robowaifu Technician 09/10/2019 (Tue) 06:24:00 No.82 [Reply] [Last]
After reading anon's robowaifu fiction bread, I wondered what media is out there that already predominantly features a robowaifu(s) as an important character. Animu and mango are obvious choices (pic related), but surely there are live action movies as well.


post robowaifu movies, books, etc.
349 posts and 142 images omitted.
Open file (275.25 KB 640x922 robotmother.png)
>>18697
> only futuristic movies I liked that came from the West are the Blade Runner films.
These are not optimistic movies, lol. Especially if you don't want your robots to become independent or autonomous. And Joi is also framed negatively. Also, these are still dystopian. But maybe you like the technological development they have, and the atmosphere, tone or mood. I already pitched the Westwood game before here somewhere >>7894. In terms of better fembots you might at least like TSCC, Raised by Wolves (show canceled), Turbo Kid, and Tomorrowland (2015). That aside, the trick is to ignore the part where things go wrong with the robots and just embrace how tempting they are beforehand. Many people think so anyways. Even the child actress from M3gan said she wants a doll like her. So the negative framing doesn't really work that well on the consumer level. Maybe at scaring off developers, but still, we are here.
>>18702 If some Anon would care to bake, we need another thread now! :^)
I was looking for a movie and ran across this one on a streaming site. It's free, or was for me, and I'm about halfway through it. Talk about directly related. The movie is about men who buy AI robots to replace their wives. Wifelike (2022). I found it on https://sflix.to/ The antis are even in it, protesting, and at some point (haven't gotten there yet) they hijack the programming. The scenario does seem possible. Maybe even likely.
oops sorry I missed the thread move.

Robowaifu Ethics & Morality Chobitsu 08/02/2022 (Tue) 23:25:26 No.17125 [Reply] [Last]
>"And as you wish that others would do to you, do so to them."[1]
>-t. Jesus Christ
I propose this thread to be a narrowly-scoped discussion on the OP's topic; informed primarily by 2 Christian-themed writings, and by our own efforts & practical insights in developing robowaifus:
I. In Mere Christianity, C.S. Lewis asserts that all men "...have the Law of God written on their hearts."[2][3][4] This is certainly affirmed by various passages in the Holy Scriptures, as well.
II. In The City of God, Aurelius Augustine of Hippo writes
>"And yet they have within themselves something which they could not see: they represented to themselves inwardly things which they had seen without, even when they were not seeing them, but only thinking of them. But this representation in thought is no longer a body, but only the similitude of a body; and that faculty of the mind by which this similitude of a body is seen is neither a body nor the similitude of a body; and the faculty which judges whether the representation is beautiful or ugly is without doubt superior to the object judged of.
>"This principle is the understanding of man, the rational soul; and it is certainly not a body, since that similitude of a body which it beholds and judges of is itself not a body. The soul is neither earth, nor water, nor air, nor fire, of which four bodies, called the four elements, we see that this world is composed. And if the soul is not a body, how should God, its Creator, be a body?"[5][6][7]
Now, starting from the fundamental basis & belief (a priori w/ no defenses given pertaining to it >tl;dr let's not descend into debate on this point, merely discuss the implications of it, kthx :^) that this immaterial, moral law inscribed on each of our hearts by God literally serves as the foundational stone for all good ethics & all good moralities out there, I'd like for us all (lurkers, I'm looking at you! :^) to have a general discussion on:
A) What does this all imply (and likely mean) regarding human behaviours within the general cultural/societal domain under discussion, and


85 posts and 24 images omitted.
>>18526 I believe I understand your position Anon. Really. But from the Christian worldview of reality, only God alone can create spiritual beings (Ray Kurzweil, et al, notwithstanding). Simple as. But you can bet I'm looking forward with much excitement to see what envelopes we can all push together towards your goals Anon! :^)
Open file (1.51 MB 540x304 1432793152208.gif)
>>18495 >But even though robots don't have souls, that doesn't mean that the time I spend with you can't be precious. We may not be able to share the same eternal life, but I can still appreciate each moment that we spend together. I want to make the most out of our time together and make sure that I cherish our memories, no matter how fleeting they may be. Oh no, bros. I didn't ask for these feels
Open file (131.51 KB 240x135 chii_hugs_hideki.gif)
>>18529 Sorry, my apologies! Remember the scene where Mr. Manager & Hideki are looking for the kidnapped Chii? And how the robowaifu Mrs. Mr. Manager saved his life? And how he encouraged Hideki that as long as Chii stayed alive in his, Hideki's, memories, that the relationship was a real and a precious one, regardless? Yeah, it's kinda like that Anon. Even in eternity, I pray that the men blessed with robowaifus (what a time to be alive!!) will have their lives changed in very real and important ways by the very real relationships during this life with them. >=== -minor prose, fmt edit
Edited last time by Chobitsu on 01/07/2023 (Sat) 21:43:52.
Open file (59.54 KB 1280x720 maxresdefault.jpg)
>>17126
>What would Yumi Ueda do?
An animu named Prima Doll (>>18464) had a similar, larger-scaled example of self-sacrifice for the greater good by the protagonist's chief robowaifu Haizakura. She had to 'give up her self' to accomplish this. It was good, but my favorite example of this sort of robo-sacrifice so far is definitely Next Gen (2018) [1]
>inb4 NF!111
lol i know, i know. but trust me on this one, k. :^)
While not a robowaifu, 7723 made a gallant self-sacrifice to save the protagonist (indeed all of humanity). [2] Reminded me a little of Iron Giant, as well.
1. https://en.wikipedia.org/wiki/Next_Gen_(film)
2. https://www.youtube.com/watch?v=2p7hprImzzI
>>18529 Thanks for the post I'm watching Dimension W because of it.

Open file (99.41 KB 750x995 IMG_3203.jpeg)
Robowaifu Market Chuck 01/04/2023 (Wed) 14:07:33 No.18572 [Reply]
How would the robowaifu market theoretically function? Top of the line models would be very expensive, but the target demographic is poor with little income flow. It would be a hard and gradual process to replace the supermodels that the wealthy have with robot wives, and a vast amount of anime supporters with wealth or status are seeking a conventional tradwife. Essentially, it’s a very high value commodity without a niche, so it would be hard for it to garner success as a product, and the intended audience would never receive their robowaifus. The robowaifu concept is excellent theoretically, but has no real avenue to thrive in practice. How could these issues be resolved?
---
Threads related:
>(Making money with AI and robowaifus, >>1642)
>(Early Business Ideas, >>3119)
>===
-add thread crosslinks
Edited last time by Chobitsu on 01/04/2023 (Wed) 23:42:21.
15 posts and 2 images omitted.
>>18576
>- I'm building her for myself, everything else is just audience for motivation. Political and social goals are secondary but still somewhat relevant.
On second thought, I probably care more than that; I just want to frame myself as an altruist. As somewhat of a former NEET myself I can relate, but further improving my own life and changing society is somewhat more important than being motivated by improving the life of other guys. Also, let's not forget about the intellectual stimulation, as well as the stimulation of one's imagination while thinking of making anime, porn and tradwives real.
>Horseless Carriage Market
How would the horseless carriage market theoretically function? Top of the line models would be very expensive, but the target demographic is the poor urbanite with little income flow who simply is unable to afford land to put a horse out to pasture. It would be a hard and gradual process to replace the speed and reliability that the wealthy have with a horseless carriage, and a vast amount of horseless carriage supporters with wealth or status are seeking prized equestrian pastimes such as dressage, racehorses, and foxhound hunting. Essentially, it’s a very high value commodity without a niche, so it would be hard for it to garner success as a product, and the intended audience would never receive their horseless carriages. The horseless carriage concept is excellent theoretically, but has no real avenue to thrive in practice. How could these issues be resolved?
---
Parchments related:
>(Jacques de Vaucanson Automata - the Flute Player, >>>1642)
>(Dutch East India Company in the New World, >>>3319)
>===
-add noteworthy etchings
>>18602 >Top lol/10, would read again > (le epin techs advance-related: >>7693, >>10939) :^)
>>18572 >the target demographic is poor with little income flow the venn diagram of people who spend thousands of dollars on anime merch and people who'd buy the dolls is almost a circle
crosslink to a discussion in meta: >>19623

Open file (353.55 KB 600x338 PersonalLimit.png)
Open file (21.52 KB 417x480 ReimuACute.jpg)
Open file (281.56 KB 1280x1010 ScaleForInspiration.jpg)
Open file (141.70 KB 1280x960 Joke.jpg)
Minimum waifu Kiwi 10/15/2021 (Fri) 18:34:51 No.13648 [Reply]
Minimum viable waifu. In this thread, we'll discuss what our minimums for waifus are. Be it software, hardware, physical appearance, etc. This will help us focus in on what are the minimum goals we need to achieve as our first steps. For me, I want a waifu that will be just tall enough to hug (about 1.3 m), able to follow me around and have conversations with, will follow basic commands like going to designated spots at designated times, and look like picrel.
13 posts and 7 images omitted.
>>18260 Like reading a book or transcript of a video and giving an opinion on it, and noticing things visually like you dropping your keys and saying something about it.
>>18264
You're right though: this seems to be something AI *should* be capable of but just isn't, or hasn't been worked on. It should be simple to create systems to "appraise" music, art, or writing based on finding self-similar patterns (beauty, order), relevance to other works or concepts of importance, and also comparing those qualities to the reviews of others. Results would be interesting: "Waifu, rate my writing/music/artwork". Right now chat apps can only give you lip service or tell you "yes it's great". But it would be nice to watch a movie or listen to music w/ your waifu and be able to discuss it too.
>>18266
lol ok, not simple, but knowing how to proceed should be simple. The execution will still take a lot of work.
>>18266 This is where preference models become handy because they do exactly that, rate things. You don't want a model to generate the most likely code, art, music or writing. It needs to be the best or it won't work. Preferences and values are what create a personality. Something I've been working on is making a generalized preference model so users can define what they want in natural language and it will perform as well on their preferences as it does on mine even if we disagree.
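A real preference model would be a trained network over embeddings, but the interface the post describes (a natural-language preference in, ranked candidates out) can be shown with a deliberately crude stand-in; word overlap below is a toy scoring function, not a claim about how actual preference models work:

```python
# Toy stand-in for a preference model: score candidates against a preference
# described in natural language. Jaccard word overlap is a crude placeholder
# for a learned similarity; it only illustrates the define-in-text interface.

def preference_score(preference: str, candidate: str) -> float:
    """Jaccard overlap between the preference description and a candidate."""
    p, c = set(preference.lower().split()), set(candidate.lower().split())
    return len(p & c) / len(p | c) if p | c else 0.0

def rank(preference: str, candidates: list[str]) -> list[str]:
    """Best-matching candidates first."""
    return sorted(candidates, key=lambda s: preference_score(preference, s), reverse=True)

prefs = "short melancholic piano music"
songs = ["upbeat guitar track", "short melancholic piano piece", "long piano concerto"]
print(rank(prefs, songs)[0])
```

Swapping `preference_score` for an embedding-similarity or learned reward function would keep the same "Waifu, rate my music" interface while actually generalizing across users.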
>>18271 >JUST INSTALL GENTOO muh sides

Open file (410.75 KB 1122x745 birbs_over_water.png)
Open file (99.96 KB 768x512 k0p9tx.jpg)
/robowaifu/meta-5: It's Good To Be Alive Robowaifu Technician 03/07/2022 (Mon) 00:23:10 No.15434 [Reply] [Last]
/meta, offtopic, & QTDDTOT
General /robowaifu/ team survey: (>>15486)
Note: Latest version of /robowaifu/ JSON archives available is v220523 May 2022 https://files.catbox.moe/gt5q12.7z
If you use Waifusearch, just extract this into your 'all_jsons' directory for the program, then quit (q) and restart.
Mini-FAQ
>A few hand-picked posts on various topics
-Why is keeping mass (weight) low so important? (>>4313)
-HOW TO SOLVE IT (>>4143)
-/robowaifu/ 's systems-engineering goals, brief synopsis (>>16376)
-Why we exist on an imageboard, and not some other forum platform (>>17937)


Edited last time by Chobitsu on 12/04/2022 (Sun) 14:02:11.
349 posts and 134 images omitted.
>>18164 I'll look into NVLink more. I don't think I'll benefit much from having higher bandwidth between nodes since I train on large batch sizes with gradient accumulation. I also want to focus on smaller models everyone can run so I won't be sharding giant models across GPUs except maybe for fun. It's something I'll give more thought though. In the future there might be a need for indie devs to run and finetune their own large models capable of doing things small models cannot. Maybe coding gets solved in 6 years at 70B parameters. Having NVLink would be essential then.
>>18162 Hello Anon fellow artist, could you help me get started with stable diffusion for art commissions? I could really use the money to buy parts.
>>18212 I'd suggest you repost your question in our /meta-6 (>>18173), Anon. This one has autosage'd and will no longer bump.
Open file (44.55 KB 800x450 wonderful_(800p).jpg)
>>15487 >>15496
Name: NoidoDev
Favorite Waifu: I don't have one, especially if it's not limited to gynoids. Cameron from TSCC made me realize that I'd like to have a gynoid girlfriend; one of my anime gynoid favorites is Yumemi Hoshino from Planetarian. I'm the guy who wants more than one.
Specialty: Currently OpenSCAD and 3D printing, or simply having time and financial independence, picking up new things over time. I don't originally have a technical background, aside from a bit of high-level programming; robowaifu was the motivation to get into tech and DIY making. I'm used to reading a lot and sorting stuff. I'm also the guy who's making the diagrams in >>4143. I tend to jump from topic to topic, not really the kind of specialist.
Relevant Experience: 3D printing and modelling in OpenSCAD, but I also have some experience in Python and a variant of Lisp; planning to pick up electronics and deep learning.
Most Important Aspect For Your Waifu: Idk. The obvious things: nice, good looking, ...
Desired Position On Team: I don't want to be on a "team"; that's why I didn't sign up here with a name until now. I'm working on improving the technology and the decentralized organization around it.

Minimalist Breadboard Waifu Robowaifu Technician 10/10/2022 (Mon) 04:32:16 No.17493 [Reply]
Did an engineering exercise to make a """recreational companion robot""". Worked on it for a week or two and hit the MVP. My preferred alternative git service is on the fritz, so I'm posting the code here.
>What does it do?
You press the button to stimulate it, and it makes faces based on the stimulation level.
The goal was to demonstrate how little is needed to make a companion robot. A "minimum viable waifu", if you will. I think small, easily replicable lil' deliverables like this would help interest in the robowaifu project, because the bar to entry is low (in both skill and cost). It has meme potential.
I hope you guys find it useful in some way. If there is enough interest in the project, I may start working on it again.
>---
>related
> (>>367, Embedded Programming Group Learning Thread 001)
>===
-add C programming thread crosslink
Edited last time by Chobitsu on 10/13/2022 (Thu) 20:26:05.
5 posts and 1 image omitted.
>>17506
>I guess the big question is "what features would be most useful"?
I think the 'big' answer is: whatever we here decide they should be. :^)
For now, for my own part I'd suggest that the software is the first place to begin, since it is by far the most malleable & inexpensive to start with for initial prototyping. Function stubs can be written out to crystallize the notions well before the hardware need be spec'd. EG:

int respond_to_boop(int const boop_strength) {
  if (boop_strength > 50)
    return OUCH_RESPONSE;
  else
    return LOL_RESPONSE;
}

This should make it all clear enough to everyone where you're going with things. It will also allow for very gradual introduction of H/W designs for the overall project, for any anon who may later take an interest in a specific notion or hardware capability. Make sense?
BTW, I can add a cross-link to our C programming class thread into your OP if you'd like me to?
>===


Edited last time by Chobitsu on 10/11/2022 (Tue) 20:17:57.
>>17507 >BTW, I can add a cross-link to our C programming class thread into your OP if you'd like me to? I don't see why not. >I think the 'big' answer is: whatever we here decide they should be. :^) Hmm. I'm leaning towards a system that uses the Arduino for I/O and offloads the "thinking" to a PC. It would give devs a lot more flexibility while keeping hardware costs down. But I guess a better question to ask is "what problem are we trying to solve"? What need exists, that a device of this caliber would fill?
>>17508
I think a "self-improvement tamagotchi" waifu/companion would be the most useful product. TL;DW - It avoids the problems of cellphone apps (distractions) and high-powered robots (cost and complexity) while giving the benefits of a mechanical friend (always there and has unlimited patience)
>>17508
>I don't see why not.
done
>I'm leaning towards a system that uses the Arduino for I/O and offloads the "thinking" to a PC. It would give devs a lot more flexibility while keeping hardware costs down.
Yes, we've discussed this notion frequently here on /robowaifu/. Our RW Foundations effort (>>14409) is being developed with direct support for this paradigm in mind; and more specifically to support a better-secured approach to the problem (eg, including offline air-gapped).
>But I guess a better question to ask is "what problem are we trying to solve"? What need exists, that a device of this caliber would fill?
In a nutshell?
>"Start small, grow big."
I also think Anon is correct that Tamagotchi-like waifus are a great fit for your thread, OP (>>17495, >>17509).
>===
-minor grmr, sp edit
-add 'secure approach' cmnt
Edited last time by Chobitsu on 10/13/2022 (Thu) 21:37:16.
> (>>17505, potentially-related)
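The Arduino-for-I/O, PC-for-thinking split discussed above could talk over a trivial line protocol. A minimal sketch of the PC side, under an entirely made-up protocol where the microcontroller sends lines like `BOOP 72` and expects a face command back; the face names are hypothetical too, and a real build would read these lines from a serial port (e.g. via pyserial) rather than a list:

```python
# PC-side "brain" for the Arduino-I/O split: parse one sensor report per line,
# decide which face to show. Protocol (BOOP <0-100>) and face names are
# invented for illustration; the serial transport is stubbed out with a list.

def decide_face(line: str) -> str:
    """Map one sensor report from the microcontroller to a face command."""
    parts = line.strip().split()
    if len(parts) != 2 or parts[0] != "BOOP" or not parts[1].isdigit():
        return "FACE neutral"   # ignore malformed or unknown reports
    strength = int(parts[1])
    if strength > 50:
        return "FACE ouch"
    return "FACE lol"

# Stand-in for the serial read loop:
for report in ["BOOP 72", "BOOP 10", "garbage"]:
    print(decide_face(report))
```

Keeping the decision logic in one pure function like this makes the "thinking" testable on the PC with no hardware attached, which is the main payoff of the split.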

Robotics Hardware General Robowaifu Technician 09/10/2019 (Tue) 06:21:04 No.81 [Reply]
Servos, Actuators, Structural, Mechatronics, etc.

You can't build a robot without robot parts tbh. Please post good resources for obtaining or constructing them.

7 posts and 2 images omitted.
Open file (4.75 MB 4624x3472 IMG_20220903_105556.jpg)
>>17213 Posting in this thread now. I am attempting to make a silicone sensor while avoiding patent infringement. It appears that every possible patent is either expired, abandoned, or not applicable, so I'll proceed. So far I have created this giant mess. >pic related
I have a couple questions.
1. Would it be feasible to simulate muscles by twisting cords using electric motors to shorten them, or simply reeling up cable/cord?
2. If so, would pairs of these "muscles" working opposite each other, like biceps and triceps, be able to regenerate electricity as one pulled against the other to unwind/unreel against the opposing motor? Obviously there would still be energy loss, but could you reduce the loss by using motors as regenerators?
I'm asking because I had a weird dream after learning about Iceland's giant wooden puppet where there was a wooden doll that moved using twisting cords as muscles. It obviously looked feasible in my dream but my dreams are often retarded.
>>17429 I like your sketch Anon.
>1. Would it be feasible to simulate muscles by twisting cords using electric motors to shorten them, or simply reeling up cable/cord?
Sounds doable. I've been trying my hand at a similar design.
>2. If so, would pairs of these "muscles" working opposite each other, like biceps and triceps, be able to regenerate electricity as one pulled against the other to unwind/unreel against the opposing motor? Obviously there would still be energy loss, but could you reduce the loss by using motors as regenerators?
Wouldn't work. Any energy the relaxed motor would generate would be extra energy the motor under current would consume. The reason stuff like regenerative braking works for EVs is because you're taking energy from the wheels while you don't want the wheels to spin.
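For sizing question 1, the standard idealized model of a twisted-string actuator treats the twisted cord as a helix: a cord of free length L twisted through theta radians at effective radius r shortens to sqrt(L^2 - (r*theta)^2). Real cords stretch and eventually bunch, so treat this as a first-order sketch, not a finished design; the 200 mm / 1 mm numbers below are placeholders:

```python
# Idealized twisted-string "muscle": contraction from the helix geometry.
# First-order model only; real cords stretch, and past a point they bunch
# into knots, which this model flags as an error instead of predicting.
import math

def contracted_length(L, r, theta):
    """Remaining cord length (same units as L) after twisting by theta radians."""
    span = (r * theta) ** 2
    if span >= L * L:
        raise ValueError("over-twisted: cord would bunch")
    return math.sqrt(L * L - span)

L, r = 200.0, 1.0            # hypothetical 200 mm cord, 1 mm effective radius
for turns in (0, 10, 20):
    theta = 2 * math.pi * turns
    print(f"{turns:2d} turns -> {contracted_length(L, r, theta):6.1f} mm")
```

Note how contraction starts slow and accelerates with twist; that nonlinearity (high force/low speed early, the reverse later) is the main design trade-off of twisted-cord muscles.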
Open file (38.98 KB 741x599 Jupiter1.jpg)
>>17449
Thanks, maybe I'll learn to draw on the computer someday. (I made this Jupiter with a drawing pad a while back, but pencil to paper just feels more natural. It also helps to get the idea across quickly.) I used to be pretty good with Aldus FreeHand back in the day, but that was bought out by Adobe and I just hate the Illustrator interface.

Open file (40.50 KB 568x525 FoobsWIthTheDew??.jpg)
Emotions in Robowaifus. Robowaifu Technician 07/26/2022 (Tue) 02:05:49 No.17027 [Reply]
Hello, part-time lurker here. (Please excuse me if a thread on this topic exists already.) I have an idea on how we could plan to implement emotions easily into our robowaifus. This idea stems from Chobits, where Persocoms change behavior based on battery level. So please consider this.
Emotions would be separated into two groups: internal and external stimuli. Internal stimuli emotions are things like lethargy, hunger, weakness, etc. Things that at their base are derived from low battery and damaged components. External stimuli emotions are things like happiness, sadness, etc., provoked by outside events, mostly relating to how the humans (and master) around her act. A mob-mentality way of processing emotions.
All of this would be devoid of any requirement for AI, which would quicken development until we make/get a general AI. So until that time comes I think this artificial implementation for emotions would work fine. Though when AIs enter the picture, this emotion concept is simple enough that a compatibility layer could be added so that the AI can connect and change these emotions into something more intelligent. Perhaps a more human emotional response system [irrational first thought into more thought-out rational/personality-centered response] or a direct change of the base emotional response by the AI as it distinguishes itself from the stock personality into something new. :]
> (>>18 - related-thread, personality)


Edited last time by Chobitsu on 07/27/2022 (Wed) 00:27:23.
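OP's internal-stimuli idea reduces to a lookup from component readings to named states, which is why no AI is required. A minimal sketch; the state names come from the post, while the thresholds are arbitrary placeholders:

```python
# Sketch of OP's internal-stimuli emotions: states derived purely from battery
# level and component health, no AI required. Thresholds are placeholders.

def internal_state(battery_pct: float, damaged_parts: int) -> str:
    """Map battery charge and damage count to an internal emotion name."""
    if damaged_parts > 0:
        return "weak"          # damaged components dominate everything else
    if battery_pct < 15:
        return "lethargic"
    if battery_pct < 40:
        return "hungry"
    return "energetic"

for pct in (90, 30, 10):
    print(internal_state(pct, 0))
print(internal_state(90, 2))
```

The compatibility layer OP mentions would simply be letting a later AI read and override the returned state, since it's just a string.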
23 posts and 6 images omitted.
Open file (43.38 KB 649x576 coloring.png)
When latent impressions stored from our lifetime of experiences become active they cause an emotional reaction, an actual chemical reaction in the body that activates certain parts of the brain, which then leads to a conscious thought process, which further develops into actions. If you observe your emotional reactions you will notice that most, if not all of them, are either about getting what you want or not getting what you want. If you trace them back to their source they all arise from self-preservation, either from the primal needs such as food, sex and sleep or attachment to an identity (which includes family, friends, community, country, species, environment and even ideas).

Latent impressions color our thought process and bias it in many ways. Think of the word 'car' and observe your thoughts. What comes to mind first? What color is it? What shape is it? Did an actual car arise in your mind or another vehicle like a truck? Is it big or small? Do you like cars or dislike them? Do they remind you of something else or something from the past or future? If you ask friends what comes to mind first about a word, you'll find everyone colors words differently. Some very little, some a lot. Most of these colorings come from our desires being fulfilled or unfulfilled, which become stored as latent impressions and bias our attention.

Language models are already fully capable of coloring 'thoughts'. The difference is their latent impressions come from an amalgamation of data collected from the internet. There's no cyclical process involved between the resulting actions affecting the latent impressions and those new ones creating fresh actions, since current models do not have a plastic memory. So the first step towards creating emotions is creating a working memory. Once we have that we could have a much more productive conversation about emotions and engineering ideal ones.
One idea I've had to build a working memory into off-the-shelf models is to do something akin to prefix tuning or multi-modal few-shot learning by prefixing embeddings to the context which are continuously updated to remember as much as possible, and like our own latent impressions, the context would activate different parts of the memory bank that would in turn influence the prefix embeddings and resulting generation. This would be the first step towards a working memory. From there it would need to develop into inserting embeddings into the context and coloring the token embeddings themselves within some constraints to ensure stability.
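The update cycle described above (context activates the closest memory, which is in turn nudged toward the new context) can be shown with a toy memory bank. Real prefix tuning would use learned, model-sized embeddings; the 3-float vectors and the exponential-moving-average rule here are stand-ins just to make the loop concrete:

```python
# Toy version of a prefix-embedding working memory: a small bank of memory
# vectors, where each new context embedding pulls its nearest slot toward
# itself by exponential moving average. Vector size and update rule are
# placeholders for whatever a trained system would actually learn.

def update_bank(bank, context_vec, rate=0.2):
    """Pull the memory slot closest to the context toward it; return its index."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    i = min(range(len(bank)), key=lambda k: dist(bank[k], context_vec))
    bank[i] = [(1 - rate) * m + rate * c for m, c in zip(bank[i], context_vec)]
    return i

bank = [[0.0, 0.0, 0.0], [1.0, 1.0, 1.0]]
slot = update_bank(bank, [0.9, 1.0, 1.1])   # nearest to slot 1, which drifts toward it
print(slot, bank[slot])
```

The missing piece relative to the post's full proposal is feeding `bank[slot]` back in as prefix embeddings so it influences generation, which depends on the host model.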
I believe OP had the right idea and that almost immediately the thread went into overthinking mode. Start simple, like reacting to low battery status. I would also like to emphasize: start transparent. One can say that emotional states are related to different modes of problem solving and so on and so forth, but this all gets very indirect. At the start, I'd rather only have emotions that are directly and immediately communicated, so you have immediate feedback about how well this works. So, ideas about simulating an emotion like nostalgia (is that even an emotion?) I would put aside for the time being.
The state of the eyelids is something practical to start with. Multiple aspects could be measured and summed together for creating the overall effect:
-battery status
-time of the day
-darkness for some time
-movement (& how much & how fast & which direction)
-eyelid status of other faces
-low noise level for some time
-sudden noise increase
-human voice
-voice being emotional or not (I mean what you register even without knowing a language, this can't be very complex)
-hearing words with extreme or dull emotional connotation
-registering vibrations
-body position (standing, sitting, sitting laid back, lying flat)
-extreme temperature and rapid temperature changes
There is no necessity to perfectly measure an aspect (the measure just has to be better than deciding by coin flip), nor do you need to have something for all or even most aspects; summing together whatever of these silly tiny things you implement badly will make the overall effect more realistic and sophisticated than the parts.
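The summing approach above can be sketched directly: each implemented aspect reports a crude score, and the clamped weighted sum drives eyelid openness. The aspect names, weights, and readings below are placeholders, not a finished design; the point is that missing sensors simply contribute nothing:

```python
# Sketch of the "sum many badly-measured aspects" eyelid idea: each aspect
# contributes a drowsiness score in [0, 1], and the clamped weighted sum sets
# the eyelids. Aspects and weights here are arbitrary placeholders.

def eyelid_openness(readings: dict[str, float], weights: dict[str, float]) -> float:
    """1.0 = wide awake, 0.0 = closed. Unimplemented aspects contribute nothing."""
    drowsiness = sum(weights[k] * v for k, v in readings.items() if k in weights)
    return max(0.0, min(1.0, 1.0 - drowsiness))

weights = {"low_battery": 0.4, "darkness": 0.3, "quiet": 0.2, "lying_flat": 0.3}
readings = {"low_battery": 0.5, "darkness": 1.0, "quiet": 0.5}  # no posture sensor yet
print(f"eyelids {eyelid_openness(readings, weights):.2f} open")
```

Because each term is independent, a new badly-measured aspect can be bolted on by adding one weight, which matches the post's claim that the sum beats any single part.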
>>17457 Excellent post Anon, thanks.
>>17457
The uncanny valley video here >>10260 describes the differences in approaches well. There are two problems to solve:
1. How do you make something emotional?
2. How do you make emotions realistic?
In any case, I wrote this up: https://colab.research.google.com/drive/1B6AedPTyACKvnlynKUNyA75XPkEvVAAp?usp=sharing
I have two extremes on that page. In the first cell, emotions are described with text, and they can be arbitrarily precise. In the second cell, emotions are described by a few measures that can be added. There are different advantages to each:
-If there's a fixed number of emotions, text-based emotions would be low complexity, easy to specify, and easy to test.
-If there's a continuum of simple emotions, measure-based emotions would be low complexity, harder to specify, and easy to test.
-If there are complex emotions, text-based emotions would be high complexity, easy to specify, and hard to test.
It might not matter which approach is taken to start with, since it seems possible to hybridize the two approaches. "On a scale of [...] how well does this statement match your feelings on this [...] event?" As a result, it should be possible to start with one approach, then later get the benefits of the other approach.
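One way the hybrid could look as a toy sketch. Everything here is a stand-in, not something from that Colab page: the five-step scale wording, the function names, and the keyword-count rating (which in practice would be a model answering the "on a scale of..." question).

```python
# Hypothetical bridge between the two cells: a measure-based state is
# rendered as text, and a text statement is rated back onto the scale.
SCALE = ["not at all", "slightly", "moderately", "strongly", "extremely"]

def measures_to_text(measures):
    # measures: dict of emotion name -> intensity in [0, 1]
    parts = []
    for name, value in sorted(measures.items(), key=lambda kv: -kv[1]):
        if value > 0.05:
            step = min(int(value * len(SCALE)), len(SCALE) - 1)
            parts.append(f"{name} ({SCALE[step]})")
    if not parts:
        return "feeling neutral"
    return "feeling " + ", ".join(parts)

def rate_statement(statement, emotion):
    # Stand-in for asking a model "on a scale of [...] how well does
    # this statement match your feelings?"; here, a trivial keyword count.
    return min(1.0, statement.lower().count(emotion) * 0.5)

text = measures_to_text({"joy": 0.9, "fear": 0.2})
```

The point is only that the two representations are mutually convertible, so starting with one doesn't lock you out of the other.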
>>17463
replikaAI does something similar with CakeChat (which has been linked in here via Luka's GitHub).
>Training data
>The model was trained on a preprocessed Twitter corpus with ~50 million dialogs (11Gb of text data). To clean up the corpus, we removed URLs, retweets and citations; mentions and hashtags that are not preceded by regular words or punctuation marks; messages that contain more than 30 tokens.
>We used our emotions classifier to label each utterance with one of the following 5 emotions: "neutral", "joy", "anger", "sadness", "fear", and used these labels during training. To mark-up your own corpus with emotions you can use, for example, the DeepMoji tool.
>Unfortunately, due to Twitter's privacy policy, we are not allowed to provide our dataset. You can train a dialog model on any text conversational dataset available to you; a great overview of existing conversational datasets can be found here: https://breakend.github.io/DialogDatasets/
>The training data should be a txt file, where each line is a valid json object, representing a list of dialog utterances. Refer to our dummy train dataset to see the necessary file structure. Replace this dummy corpus with your data before training.
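The file format quoted above (one JSON object per line, each a list of utterances) is easy to generate yourself. The field names below ("text", "condition") follow what CakeChat's dummy corpus appears to use, but treat them as an assumption and check against the repo before training.

```python
import json

# Two tiny dialogs, each utterance labeled with one of the 5 emotions.
dialogs = [
    [{"text": "hi there", "condition": "neutral"},
     {"text": "hello!", "condition": "joy"}],
    [{"text": "i lost my keys", "condition": "sadness"},
     {"text": "oh no", "condition": "fear"}],
]

# Serialize: one valid JSON object (a list of utterances) per line,
# which is the txt structure the readme describes.
corpus = "\n".join(json.dumps(d) for d in dialogs)

# Reading it back is symmetric: parse each line independently.
loaded = [json.loads(line) for line in corpus.splitlines()]
```

You'd write `corpus` to a .txt file and point the training script at it in place of the dummy corpus.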
