/robowaifu/ - DIY Robot Wives

Advancing robotics to a point where anime catgrill meidos in tiny miniskirts are a reality.

Build Back Better

More updates on the way. -r



Open file (1.11 MB 894x950 Sophie_Head.png)
Elfdroid Sophie Thread #3 SophieDev 02/18/2022 (Fri) 11:38:48 No.15236 [Reply] [Last]
New video of Sophie is now up: https://www.youtube.com/watch?v=XOGrdHn7wBU

Finally got her head working in a reproducible manner. I had completely broken about seven of her previous micro-servos. But the burnt-out ones weren't the big problem - I just connect suspect servos to my Arduino UNO, and if my control servo can run the 'Sweep' sketch but the suspect servo cannot, I know it's dead and in the bin it goes. However, one servo was damaged but still working - except it caused some kind of feedback that made all other micro servos connected to the same circuit go haywire - even brand new ones. Luckily the faulty servo in question was old and had spraypaint on it, so I could tell it apart from the others. But that was very confusing - at first I thought it might be related to the small magnets that hold her faceplate on, but this is not the case. Rogue servos are definitely something to watch out for in future.

Anyway, now that I have measurable, standardised voltage going into all of the micro servos, I'd like to upgrade her neck again. She can actually shake her head (but not in the above video, as it is addressed straight-to-camera), but she still cannot nod her head, as it weighs too much and the neck servo overheats rapidly. Heads are relatively heavy things (especially with long hair).

The breakthrough with Sophie has been splitting her up into separate subsystems and separate circuits, then focusing on only ONE subsystem/circuit - in my case her head. For a beginner like me, it was just too confusing and labour-intensive to attempt programming her head, speech, neck, arms and hands all simultaneously. When errors occurred I was having a real hard time pinning down which motors were affected, how badly they were affected and why. Having to tear down one large, complex system is waaaay harder than troubleshooting something far smaller. So focusing only on her head solved this problem.
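For anyone wanting to run the same triage, the servo test described above is just the stock 'Sweep' example: run it with a known-good servo first, then swap in the suspect on the same pin and supply. Roughly like this (a sketch for an Arduino UNO with the bundled Servo library; the pin number and delay are my assumptions, adjust for your wiring):

```cpp
#include <Servo.h>

Servo testServo;          // attach the control servo first, then the suspect
const int SERVO_PIN = 9;  // assumed: any PWM-capable pin works

void setup() {
  testServo.attach(SERVO_PIN);
}

void loop() {
  // A healthy servo tracks this motion smoothly in both directions;
  // a dead one stays put, and a rogue one may glitch other servos
  // sharing the same supply rail.
  for (int pos = 0; pos <= 180; pos++) {  // sweep 0 -> 180 degrees
    testServo.write(pos);
    delay(15);                            // ~15 ms per degree keeps it visible
  }
  for (int pos = 180; pos >= 0; pos--) {  // and back
    testServo.write(pos);
    delay(15);
  }
}
```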


55 posts and 16 images omitted.
>>30892
>Moving into own house
Congratulations.
>Only keeping electronic control boards, a single robot hand and Sophie's head.
I would keep it as memorabilia, it's just one small box. Also, keep in mind some museum might want it one day.
>>30899
>Also, keep in mind some museum might want it one day.
This is a really good point, Anon.
>>30892 You could just construct a simple frame and fabric doll body for the head and hand if you wanted more than it just sitting on a shelf in some closet.
Open file (373.11 KB 1100x448 servo_horns.png)
>>30894 >>30899
>Congratulations.
Thank you, both. As a home and mortgage owner, I am now officially another normie cog in the machine! Lots of work already done, lots for me still to do in the near future!

Speaking of cogs in machines: the main design problem with the Elfdroid arms was my choice of servo horn. The round ones (even aluminum) are no good for a joint that supports a lot of weight and must move back-and-forth repeatedly. Those round servo horns are only held onto the servo output shaft by little gear-teeth and/or a small grub-screw. This will wear a groove into the output shaft after only a few dozen limb movements, meaning the servo horn will begin to slip on the shaft, resulting in huge errors of movement in the robot's limb (no matter how well-programmed the servos). If this problem is ignored for long enough, the robot's limb will eventually detach mid-movement, often yanking out wires and bending electronic pins in the process.

For any joint under heavy load, I highly recommend using the best R/C car-type aluminum servo horns that you can find. I used a couple of these in Sophie's neck joints and they eliminate the servo output shaft wear/slippage problem entirely. Of course, they are far more expensive, precision-machined components, but that's part-and-parcel of any serious robotics project (as I learned when reading about backlash and harmonic drives). Thought I should leave the main design + build error that I made here at the end, just in case anyone else ever tries building Elfdroid arms or wants to edit the CAD files.
>>30927 Thank you very kindly for that information, SophieDev. Very helpful. I pray that God keeps you safe, and guides you on your further journeys in your life. You've been a huge encouragement and help to all of us here on /robowaifu/ with your development of dear Sophie. Even if you never pick up robowaifu research again in the future, you've already made a big impact in our early developments. Thank you for that. Please don't be a stranger, and stop by occasionally to give us updates on your doings, Anon. Cheers!! :^)

3D printer resources Robowaifu Technician 09/11/2019 (Wed) 01:08:12 No.94 [Reply] [Last]
Cheap and easy 3D printing is vital for a cottage industry making custom robowaifus. Please post good resources on 3D printing.

www.3dprinter.net/
https://archive.is/YdvXj
220 posts and 42 images omitted.
>>27835
>3D PRINTING HACK LEVERAGES VASE MODE STRUCTURALLY
Thanks! Follow up: https://youtu.be/-dy-4_L4p9s
I also recommend signing up to the channel: https://www.youtube.com/@DreadMakerRoberts
Shaving prints instead of sanding: https://youtu.be/TbvFPN7yxt0
3D printing glue: https://youtu.be/zp6ODP8AJmk
Thanks for the nice contribs, NoidoDev. Cheers. :^)
via >>30501
>This article demonstrates a two-step method to 3D print double network hydrogels at room temperature with a low-cost ($300) 3D printer. A first network precursor solution was made 3D printable via extrusion from a nozzle by adding a layered silicate to make it shear-thinning. After printing and UV-curing, objects were soaked in a second network precursor solution and UV-cured again to create interpenetrating networks of poly(2-acrylamido-2-methylpropanesulfonate) and polyacrylamide. By varying the ratio of polyacrylamide to cross-linker, the trade-off between stiffness and maximum elongation of the gel can be tuned to yield a compression strength and elastic modulus of 61.9 and 0.44 MPa, respectively, values that are greater than those reported for bovine cartilage. The maximum compressive (93.5 MPa) and tensile (1.4 MPa) strengths of the gel are twice that of previous 3D printed gels, and the gel does not deform after it is soaked in water. By 3D printing a synthetic meniscus from an X-ray computed tomography image of an anatomical model, we demonstrate the potential to customize hydrogel implants based on 3D images of a patient’s anatomy.
https://pubs.acs.org/doi/abs/10.1021/acsbiomaterials.7b00094
I found what might be an excellent low-cost resource: used or refurbished 3D printers on eBay. I found it by accident while looking for 3D stuff on eBay. Creality has their own eBay store where they sell refurbished printers. The bad part: Creality's quality control sucks, bad. These are printers someone has already returned, and while they say they refurbish them, some appear from the bad reviews to be thrown in the same box and shipped out again.

So why bother? The VERY important point is that because they are refurbished through eBay, Allstate insurance has a two-year warranty on these things. They also tend to cost slightly over or around half the price of new. Still, you might say that's a bad deal, but if you look at the reviews on Amazon, the negatives tend to be exactly the same percentage as they are on the returned, refurbished printers. So you're really not taking much if any more risk, and you're getting a much better warranty. Another way to think of it is that for any part that may be bad, you could upgrade to a far better one, make sure everything works, and still be out less money than one bought new from Amazon, while taking no more numerical risk of getting a bad one.

Now it would be nice to have the funds to buy a perfect printer, but to get a really good one you're talking $500 or higher. Base-model Ender 3s you can get for around $100 used. I like the Ender 3 V3 SE. I like the dual screws on the Y and Z axes; this gives you a great platform for stability, and the direct-drive extruder is far better than Bowden cables. Bowden cables are likely to be a problem if you speed it up to go faster - the cable friction causes problems, though at low speeds it's perfectly fine. The SE is not the fastest, but I think you could add parts over time and make it better and better. Creality is not the best, but they have sold so many that there's a pile of hardware and software add-ons. So over time you could make it what you wanted while getting into it cheaply.
>>30917 Interesting idea. Thanks, Grommet! Cheers. :^)

/robowaifu/ Embassy Thread Chobitsu Board owner 05/08/2020 (Fri) 22:48:24 No.2823 [Reply] [Last]
This is the /robowaifu/ embassy thread. It's a place where Anons from all other communities can congregate and talk with us about their groups & interests, and also network with each other and with us. ITT we're all united together under the common banner of building robowaifus as we so desire. Welcome.

Since this is the ambassadorial thread of /robowaifu/, we're curious what other communities who know of us are up to. So w/o doxxing yourselves, if this is your first time posting ITT, please tell us about your home communities if you wouldn't mind, Anons. What do you like about them, etc? What brought you to /robowaifu/ today? The point here is to create a connection and find common ground for outreach with each other's communities. Also, if you have any questions or concerns for us, please feel free to share them here as well.
Edited last time by Chobitsu on 05/23/2020 (Sat) 23:13:16.
335 posts and 102 images omitted.
>>30837
Brevity is the soul of wit, I know. In the same vein of mobilizing people ASAP to aid in the robowaifu effort, I was wondering if there were any cloud computing jobs people can donate CPU/GPU power to? For a large chunk of the day my PC is sleeping, but I wouldn't mind it being used to help the cause if such an option was available.
>"If you know literally nothing but you have an idle GPU/CPU that you hardly utilize, you can help by doing this (insert easy-to-use github link)"
i.e. put the robots/AI we have available today to work to make better robots/AI.
>>30853
>I was wondering if there were any cloud computing jobs people can donate CPU/GPU power to? For a large chunk of the day my PC is sleeping but I wouldn't mind it being used to help the cause if such an option was available.
We actually have an entire thread dedicated to this exact topic, Anon: (>>8958).
>>30838
It's more about giving people insight into how far along we are. I don't think a lot of people realise how close we can get with the AI side of things. If more people know, it's more hands on deck.
Open file (1.05 MB 1024x1024 00011-3045581562.png)
>>30914
If they have time to poke their heads up out of their cubicles or stop door-dashing for 5 minutes, perhaps. That being said, if you need help with setting up streaming, recording, AI art generation, and other production-related stuff, I can perhaps contribute if you need an extra hand. You mentioned wanting someone who was okay doing a presentation. While I'm okay speaking, I don't think my voice is anything special. Given the nature of your content, you may want to consider AI voice synth. Sadly, I have no experience in that realm.
>>30914 >>30915 If you two want to go the streaming route, then we already have a Robowaifu Podcast thread that would be a good fit.

ROBOWAIFU U Robowaifu Technician 09/15/2019 (Sun) 05:52:02 No.235 [Reply] [Last]
In this thread post links to books, videos, MOOCs, tutorials, forums, and general learning resources about creating robots (particularly humanoid robots), writing AI or other robotics related software, design or art software, electronics, makerspace training stuff or just about anything that's specifically an educational resource and also useful for anons learning how to build their own robowaifus. >tl;dr ITT we mek /robowaifu/ school.
Edited last time by Chobitsu on 05/11/2020 (Mon) 21:31:04.
134 posts and 72 images omitted.
>>30905 Ahh, got it. Thanks kindly, Anon. Cheers. :^)
>>30907
It's also really simple to program if you want to play around with it; it's how sqrt() is done when the CPU doesn't have a math chip.

void main()
{
    float guess = 13;
    float ans = guess;

    // newtons method formula = x - f(x)/`f(x)
    // using f(x)= 3x^3 + 5x^2 - 3x - 2
    //      `f(x)= 9x^2 + 10x - 3
    for ( int i=0; i<50; i++ )
    {
        float fx = 3*(ans*ans*ans) + 5*(ans*ans) - 3*ans - 2;
        float dfx = 9*(ans*ans) + 18*ans - 3;
        float guess = ans - (fx / dfx);
        if ( guess == ans )
            break;
        ans = guess;
    }
}


>>30908
Neat! I assume you'd add some type of Epsilon test (line #14) in for a working example, Anon?
---
>also: I notice that the formula you reference in the comments (line #8) seems to be different than the code example (line #12)?
>note: for my line numbers, I'm using my own (re)formatted version of your code example. cf: https://trashchan.xyz/robowaifu/thread/26.html#27
>===
-add image hotlink
-minor edit
Edited last time by Chobitsu on 04/14/2024 (Sun) 13:40:02.
>>30909
Meant to be 10*ans not 18*ans, my bad. Good mistake to see how the method works, though, since you still converge on one of the roots even though the derivative is wrong; it just takes more iterations. Lots of things you can do to make it better, of course - just wanted to keep it simple.
>>30910 Ahh, thanks. It's rather nice that the eminent Newton devised such a computationally-simple approach to this need. I'm sure with modern superscalar architectures, the solution would converge in very few clocks. Cheers. :^)

Robotics sites, organizations and projects Robowaifu Technician 09/16/2019 (Mon) 04:21:24 No.268 [Reply] [Last]
There are a lot of robotics research and projects going on out there in the world, many of them with direct application for us here. Please contribute important, quality links you find that you think would help /robowaifu/.

Robot Operating System (ROS)
www.ros.org/
>[ROS] is a flexible framework for writing robot software. It is a collection of tools, libraries, and conventions that aim to simplify the task of creating complex and robust robot behavior across a wide variety of robotic platforms.
87 posts and 33 images omitted.
>>30210 Yes, I think you're right. It's a strange dichotomy. There's some dynamic of the old turn of speech 'strange bedfellows' at play here IMO. < On the one hand: We here (and other Anons like us), with our clearly muhsoggyknees, literally Hitler desire for loving, helpful, and charming waifus. < On the other hand: Enthusiast groups largely funded by the evil globohomo, who want nothing more than to quadruple-down on an ultra-pozz overdose, gleefully pushing for an end to Whitey for good, and the destruction of the Patriarchy!111!!ONE!! ...both working together for the common goal of AI 'benevolence' towards all mankind! :DD Strange Days indeed, heh. :^) >=== -minor edit
Edited last time by Chobitsu on 03/09/2024 (Sat) 03:55:27.
I came across this soft robotics focused site with instructions for different actuators, mainly pneumatic. Their YouTube channel barely has followers so I may be first here to point this out. https://opensoftmachines.com/
>ROS
>Gazebo
>Easily install the Robot Operating System (ROS) on any Linux distribution
>Want to use ROS, but don't want to run Ubuntu?
>This project uses the power of Nix to make it possible to develop and run ROS packages in the same way on any Linux machine.
https://github.com/lopsided98/nix-ros-overlay/tree/develop
>>30887 Very nice. Thanks NoidoDev, I briefly considered this very topic a few years ago and basically figured it was a non-starter b/c CBA to try and manage moving it off of Ub*ntu . Maybe it's time to reconsider the prospect again though I anticipate there will still linger a lot of dependency hell, since ROS is a sprawling mess with a yuge attack surface.
>>30268 Thanks Anon! Very interesting approaches.

Open file (8.45 MB 2000x2811 ClipboardImage.png)
Cognitive Architecture: Discussion Kiwi 08/22/2023 (Tue) 05:03:37 No.24783 [Reply] [Last]
Chii Cogito Ergo Chii
Chii thinks, therefore Chii is.

Cognitive architecture is the study of the building blocks which lead to cognition - the structures from which thought emerges. Let's start with the three main aspects of mind:

Sentience: Ability to experience sensations and feelings. Her sensors communicate states to her. She senses your hand holding hers and can react. Feelings, having emotions. Her hand being held brings her happiness. This builds on her capacity for subjective experience, related to qualia.

Self-awareness: Capacity to differentiate the self from external actors and objects. When presented with a mirror, echo, or other self-referential sensory input, it is recognized as the self. She sees herself in your eyes' reflection and recognizes that is her, that she is being held by you.

Sapience: Perception of knowledge. Linking concepts and meanings. Able to discern correlations congruent with having wisdom. She sees you collapse into your chair. She infers your state of exhaustion and brings you something to drink.

These building blocks integrate and allow her to be. She doesn't just feel, she has qualia. She doesn't see her reflection, she sees herself reflected; she acknowledges her own existence. She doesn't just find relevant data, she works with concepts and integrates her feelings and personality when forming a response. Cognition: subjective thought reliant on a conscious separation of the self and external reality that integrates knowledge of the latter. A state beyond current AI, a true intellect. This thread is dedicated to all the steps on the long journey towards a waifu that truly thinks and feels.


Edited last time by Chobitsu on 09/17/2023 (Sun) 20:43:41.
315 posts and 122 images omitted.
>>30102
>Update
LidaPy seems nice, but it too is in an EoL programming language, Python 2. My only option at this point is to see if anyone can refactor it to Python 3, or just use very old software to test it. I'm feeling like it might be best to DIY something on a newer platform that follows the LIDA framework. The LIDA tutorial even repeatedly states: "The Framework constitutes one, but not the only, way of implementing the Model.", like they want you to make a new implementation. Before I go ahead with any work, it's always important to remember to check who owns the rights to any IP. I would be a research associate at a university if it weren't for IP rights, and I've pondered going to Memphis to develop LIDA if I would have the right to use it in my bot commercially. I'll post an update if there's any progress.
>>30840
>but it too is in a EoL programming language, python2.
>My only option at this point is to see if anyone can refactor it to python3 or just use very old software to test it.
Might I suggest an alternative option of having someone rewrite it in C++, Anon? 40+ years and still going strong today (fully backwards-compatible, of course -- basic C++ code written in the 80's/90's will still build today!) :^)
Good luck, Anon. I hope you can succeed with your LIDA research. Cheers. :^)
Potentially useful, potentially on-topic thread on 4cuck/sci/. I was there looking around for the linked thread from our Propaganda thread lol. https://boards.4chan.org/sci/thread/16087430#p16087430
>>30840 Python2 can still be installed. Also with installers like Nix you should be able to install old versions of Java.
>>30863
I looked into it and found that it is not recommended to install Python 2 anymore. You can install PyPy or IronPython instead. There seem to also be some other long-term support options. I don't know which Java it needs, but JRE8 seems to be in the Nix repo. You can install and run software exclusively in a nix-shell, but I'm new to this myself. I might be able to help a little bit. I also looked a bit into LIDA itself, and it looks like something along the lines of how I would've imagined it. I might try it out at some point, and when I start to implement something myself I might use it as a resource. I will most likely learn Elixir while doing it, at least for any part which is not about number crunching.

Robot Vision General Robowaifu Technician 09/11/2019 (Wed) 01:13:09 No.97 [Reply] [Last]
Cameras, Lenses, Actuators, Control Systems

Unless you want to deck out your waifubot in dark glasses and a white cane, learning about vision systems is a good idea. Please post resources here.

opencv.org/
https://archive.is/7dFuu

github.com/opencv/opencv
https://archive.is/PEFzq

www.robotshop.com/en/cameras-vision-sensors.html
https://archive.is/7ESmt
Edited last time by Chobitsu on 09/11/2019 (Wed) 01:14:45.
116 posts and 52 images omitted.
>>29915 Started on the kinect lite guide because I don't want giant XBOX 360 bars on my robot's face. And just now after saying it I regret hacking it apart. It's still huge after making it half the size, the length of a smartphone. https://medium.com/robotics-weekends/how-to-turn-old-kinect-into-a-compact-usb-powered-rgbd-sensor-f23d58e10eb0
>>30877
I know this is a stupid question, but can you strip those components right out of the support frame and have them simply connected by the wires?
>>30879
Zoom in on the hole in the centre. Looks like there is a circuit board under there. If one were to take it out of the frame, it would require adding wires and attaching them back to the circuit board, I imagine.
>>30879 >>30880 I expect the physical positioning of the 3 camera components is tightly registered. Could be recalibrated I'm sure, but it would need to be done.
>>30879
>Depth Perception
From what I know, these systems work by knowing the distance between the two cameras, and this is part of the hardware. If you want to do this yourself, then your system would need to know the distance. I think Kudan SLAM is a software doing that: >>29937 and >>10646
>Kudan Visual SLAM
>This tutorial tells you how to run a Kudan Visual SLAM (KdVisual) system using ROS 2 bags as the input containing data of a robot exploring an area
https://amrdocs.intel.com/docs/2023.1.0/dev_guide/files/kudan-slam.html
>The Camera Basics for Visual SLAM
>“Simultaneous Localization and Mapping usually refer to a robot or a moving rigid body, equipped with a specific sensor, that estimates its motion and builds a model of the surrounding environment, without a priori information [2]. If the sensor referred to here is mainly a camera, it is called Visual SLAM.”
https://www.kudan.io/blog/camera-basics-visual-slam/
>... ideal frame rate ... 15 fps: for applications with robots that move at a speed of 1~2m/s
>The broader the camera’s field of view, the more robust and accurate SLAM performance you can expect up to some point.
>...the larger the dynamic range is, the better the SLAM performance.
>... global shutter cameras are highly recommended for handheld, wearables, robotics, and vehicles applications.
>Baseline is the distance between the two lenses of the stereo cameras. This specification is essential for use-cases involving Stereo SLAM using stereo cameras.
>We defined Visual SLAM to use the camera as the sensor, but it can additionally fuse other sensors.
>Based on our experience, frame skip/drop, noise in images, and IR projection are typical pitfalls to watch out for.
>Color image: Greyscale images suffice for most SLAM applications



Open file (992.12 KB 844x863 Vitruvian Miku.jpg)
Body Proportions Robowaifu Technician 11/24/2021 (Wed) 01:35:27 No.14388 [Reply]
I don't know if this was covered already in an existing thread, but I thought I'd make a thread dedicated to body proportions. Obviously everyone has different ideas of what they want in a waifu, and body proportions are no exception, but for the sake of discussion this is for a realistic adult, human female of indefinite height. Height is especially tricky, since proportions play a huge role in perceived size. A lot of the information on human body proportions easily available online is idealized, rather than realistic, which isn't a problem in itself, but most of it is about men's body proportions, and it often tends to be self-contradicting. Common examples of idealized proportions are the Vitruvian Man or the 'Physical characteristics of the Buddha', neither of which are very useful for waifus. Barbie dolls are a common example of unrealistic beauty standards, but deviate too much from reality to be seen as anything but Barbie dolls. The only consistently agreed-on idealized proportions are an arm span matching the height, and legs that make up half of the height. Everything else seems to be completely up in the air. Any thoughts or resources on the subject would be appreciated.
29 posts and 27 images omitted.
Useful anthropometric data for understanding the various measurements of the human body. https://msis.jsc.nasa.gov/sections/section03.htm https://stacks.cdc.gov/view/cdc/100478 https://www.sciencedirect.com/topics/engineering/anthropometric-data >=== -hotlinks patch
Edited last time by Chobitsu on 04/11/2024 (Thu) 08:19:20.
Open file (783.49 KB 808x808 00063-1778171855.png)
I'd err more on the pragmatic side. Make the bot as big as it needs to be to accommodate the necessary internal components to facilitate locomotion. I think the happy medium of size and functionality is around 5 feet tall. I would imagine it would utilize child-like proportions, not to emulate kids but because it'd be cheaper and easier to perform maintenance on a bot with a smaller frame. It'd also be easier to store since it'd take up less space. It doesn't matter if the limb length or hip width is slightly exaggerated if it's for the sake of accommodating a bot's ability to navigate and interact with its environment. As far as the "child sex doll" issue goes, I'd simply have the external body looking more mechanical aesthetically.
>>30779 If you get it working, I'll do my best to have an updated archive of JSONs to use with it for you, Anon. Just let me know! Cheers. :^) >>30817 Neat! Thanks, Anon! :^) >=== -add'l resp
Edited last time by Chobitsu on 04/11/2024 (Thu) 08:18:09.
I'll post the resources I use for this. I typically use BJDs (ball-jointed dolls) for proportions.
>>30874 Some art references.

Open file (118.04 KB 800x569 original.jpg)
Robowaifu Psychology Thread Robowaifu Technician 05/07/2020 (Thu) 09:27:27 No.2731 [Reply]
I hope to have useful links and images in the future; this is just a quickly thrown together, glorified character-sheet maker at this point. Ok, so you can program, but HOW do you make her thoughts work? At least on a creative level. I don't have much to contribute other than my rather obsessive what-ifs; I hope this is useful somehow.

A few questions you might want to ask yourself before typing up some kind of bio, or writing down conversations and quotes you imagine she would say, are...
1. How close to the canon do I want to be?
2. How much canon is there?
3. How would I want to make her mine vs someone else's interpretation of the same character?

Take note of those answers; if your memory sucks, record them in a method you are comfortable with. I think typing might be faster for most. And you might want to revisit what you wrote here sometimes when making certain personality design choices. Use your answers here as a basic guide.

For the most part, just go through writers' sites for character questionnaires. And before you omit a question, think of how you could use the basics of what it is asking to build your waifu's personality. For example, if a question rubs you the wrong way politically, still use that question, but answer it in your own way or even reword it. Some of these types of questions are supposed to make you think hard about what shaped your character's dislikes, which is still important to what makes a person themselves. You may need to revisit some of these or even omit certain ones entirely, but try to figure out how to get that info in somehow later. This process can take a long time and be frustrating, but I think it has a place in the waifubot creation experience. Also, try to think how your waifu would react if the story went differently at some point. This can get really dramatic real easy, but it doesn't have to. Just start with simple things, like: what would she say if asked to tell a joke? What does she find funny?
What does she find cringey? Things like that. And don't be afraid to make what they call a 'brain dump': pretty much just type with minimal breaks, everything that comes to your mind about the topic. You might get some useful info amongst the 'why am I doing this?' and 'I need to take a shit' quotes. Also, just use some of those story prompts - try the more realistic, day-to-day ones, things that could happen in real life. Less exciting, but pretty sure you aren't going on fantasy journeys with her IRL. Using these types of prompts will give her more to say on mundane everyday things, vs Skyrim politics. (But that could be fun sometimes)


Edited last time by Chobitsu on 05/07/2020 (Thu) 09:43:48.
28 posts and 7 images omitted.
>>24888 Excellent. Moar pls! :^)
I was looking for software that would help an AI sort known people into categories based on traits, including psychological traits. Something like a pattern for personas. Anyways, I found something else instead, which would be useful to test a robowaifu and get ideas about how to design her AI:
>PsyToolkit’s experiment library
https://www.psytoolkit.org/experiment-library/
>>30862 Thanks NoidoDev! I'm quite skeptical of """modern""" Psychological so-called 'Science', personally. Do you think this will be as valuable in designing AI as simple field-tests with volunteer Anons would be? One of the brilliant '12 principles of Animation' [1] is character appeal [2][3]. Arguably, focusing instead on the concrete 'deliverables' of this list (insofar as each might pertain directly to real-world robowaifus) is a more-tractable general approach for us all IMO -- including the development of her AI. Also, it seems to me that as long as we get that one right (character appeal), we'll all be golden at producing great opensource robowaifus -- and with much less 'hemming and hawing' along the way. Any ideas about my points, Anon? --- 1. https://www.youtube.com/watch?v=uDqjIdI4bF4 2. https://www.animationmentor.com/blog/appeal-the-12-basic-principles-of-animation/ 3. This concept also has strong ties to the Greek concept of Ethos. >=== -fmt, prose edit
Edited last time by Chobitsu on 04/11/2024 (Thu) 08:05:10.
>>30864
>I'm quite skeptical of """modern""" Psychological so-called 'Science', personally.
That might be somewhat justified, but I wouldn't throw the child out with the bathwater. Any way of describing a personality should be considered worth contemplating in regards to configuration of the RW personality, but also for thinking about which topics need to be covered when it comes to work on the AI. Then tests can be used to see if she has certain skills and traits. On the other hand, understanding people by having some framework should also be useful at some point.
>Animation' [1] is character appeal [2][3]
Sorry, but this isn't the topic. This is about looks; I was arguing about personality or psychological traits.
>>30866 >that might be somewhat justified but I would[n't] throw the child out with the bathwater. Fair enough. Certainly it's good to devise some type of metrics for robowaifu performance analysis -- including her cognitive functions ofc. >Sorry, but this isn't the topic. This is about looks, I was arguing about personality or psychological traits. Actually, character appeal goes right to the very core of what an interesting person (or robowaifu) is or isn't. It touches on practically every.single.aspect. of robowaifu design & production AFAICT. Certainly, as a software engineer currently focusing on trying to make robowaifus realistic and pleasant to live with, many of the 12 principles are guidestones for the systems-software design -- and none more important than character appeal. I'd recommend you reconsider my advice, as well as examine this particular principle in some depth. Cheers Anon. :^)

LLM & Chatbot General Robowaifu Technician 09/15/2019 (Sun) 10:18:46 No.250 [Reply] [Last]
OpenAI/GPT-2
This has to be one of the biggest breakthroughs in deep learning and AI so far. It's extremely skilled in developing coherent humanlike responses that make sense, and I believe it has massive potential; it also never gives the same answer twice.
>GPT-2 generates synthetic text samples in response to the model being primed with an arbitrary input. The model is chameleon-like—it adapts to the style and content of the conditioning text. This allows the user to generate realistic and coherent continuations about a topic of their choosing
>GPT-2 displays a broad set of capabilities, including the ability to generate conditional synthetic text samples of unprecedented quality, where we prime the model with an input and have it generate a lengthy continuation. In addition, GPT-2 outperforms other language models trained on specific domains (like Wikipedia, news, or books) without needing to use these domain-specific training datasets.
Also, the current public model shown here only uses 345 million parameters; the "full" AI (which has over 4x as many parameters) is being withheld from the public because of its "potential for abuse". That is to say, the full model is so proficient in mimicking human communication that it could be abused to create new articles, posts, advertisements, even books, and nobody would be able to tell that there was a bot behind it all.
<AI demo: talktotransformer.com/
<Other links:
github.com/openai/gpt-2
openai.com/blog/better-language-models/
huggingface.co/


Edited last time by Kiwi_ on 01/16/2024 (Tue) 23:04:32.
265 posts and 84 images omitted.
Kinda in the wrong thread, we have one specific for voice and speech. But thanks, no problem. You probably didn't find the right one because you need to search for "speech generation" not "voice ...". I put my answer in there: >>30625
Hello /robowaifu/. Honestly glad to see a chatbot thread. I usually just lurk here, but glad to see a proper thread for these, and it's an actual discussion; I'm so used to /g/'s usual chaos. Hmm, I've been wondering how to improve my chatbot experience: while I can make great bots for usage, I've been wanting to explore using text-to-speech to expand on them.
>>30813 If you want advice, I still suggest /g/'s /lmg/. They're quite helpful.
Some guy (Morgan Millipede) started to reverse engineer Neuro-Sama: https://youtu.be/uLG8Bvy47-4 - basically just a humorous introduction on how to do this (he has a $4k computer, though, and she's slower in her responses at the beginning). 4chan responded: https://youtu.be/PRAEuS-PkAk - Her response time improved since the first video.
>>30821 Lol. Thanks NoidoDev, I'll try to make time to look these over. Cheers. :^)
