/robowaifu/ - DIY Robot Wives

Advancing robotics to a point where anime catgrill meidos in tiny miniskirts are a reality.

Downtime was caused by the hosting service's network going down. Should be OK now.

An issue with the Webring addon was causing Lynxchan to intermittently crash. The issue has been fixed.



Vera stayed by Anon's side, continuing to support him in building new programs, but their primary goal was no longer work or money or fame.


/robowaifu/meta-7: Hanging down at 7-eleven Chobitsu 02/18/2023 (Sat) 11:00:31 No.20356 [Reply] [Last]
/meta, offtopic, & QTDDTOT
General /robowaifu/ team survey (please reply ITT) (>>15486)
>---
Mini-FAQ
>A few hand-picked posts on various /robowaifu/-related topics
-Why is keeping mass (weight) low so important? (>>4313)
-How to get started with AI/ML for beginners (>>18306)
-"The Big 4" things we need to solve here (>>15182)
-HOW TO SOLVE IT (>>4143)
-Why we exist on an imageboard, and not some other forum platform (>>15638, >>17937)
-This is madness! You can't possibly succeed, so why even bother? (>>20208)
-All AI programming is done in Python. Why are you using C++ here? (>>21057, >>21091)
-How to learn to program in C++ for robowaifus (>>18749)


Edited last time by Chobitsu on 04/09/2023 (Sun) 12:12:18.
361 posts and 77 images omitted.
Here's my current plan for AI hardware:
- I recently ordered a used K80 with 2x12 GB for $100/95€, shipping included. It's an old GPU, only supported by older CUDA versions, and it might not run quantized models. It uses a lot of energy, but it's effectively two GPUs with 12GB each. After a while I plan to pair it with an RTX 3060 (12GB, 300€ used or 400€ new, I think) in one home server. Context: https://technical.city/en/video/Tesla-K80m-vs-GeForce-RTX-3060
- That machine will then run my 12GB models. For fine-tuning, or for models which don't run on the K80, I would use the 3060. I don't know yet if I can somehow join them together and use 3x12GB across the bus; it just seems to need software support in the programs for running models at home.
- I plan to use online services like Colab to learn how to run these things, but keep the K80 for more private data and for learning how to do it at home.
- Then I'll get some SBCs, most likely Orange Pis, which can run the small models of Whisper (speech recognition). Also another small home server with an Intel Arc A380 (140-160€), which is fast enough to run the big and better Whisper model at the speed of one fast human speaker, and does so quite energy-efficiently. These devices will not run anything else, for security reasons, and will be connected to microphones which will always be on. The server will receive the audio from the SBCs in all rooms over the home network (likely on a VPN using Tinc). All of them will send the transcripts to some server in my network which can then decide how to respond, most likely filtering first for which data is more sensitive than the rest.
- Some small device, like a Raspi, will maybe handle responses based on AIML or using some small model.
- Questions which don't contain private information might be sent to OpenAI or another service.
- The next step up will be getting an M40 (180€) and then a used RTX 3090 (700-800€ right now, I think), putting them in another home server at some point. Of course I might use the 3090 for gaming until I get the GPU after that. These can handle the models which need 24 GB. The 3090 will do the fine-tuning if I want that, since it has more power, while the M40 doesn't need as much energy. Context: https://technical.city/en/video/GeForce-RTX-3090-vs-Tesla-M40-24-GB
- The step after that might be an AMD XTX (1k-1.2k€), if it's supported well enough for AI by then. I could use it for gaming and put the 3090 in a home server with the M40. If it's possible to combine cards over PCI Express, it might be interesting to think about getting another XTX later and having 48GB of VRAM.
- But I hope that either Intel or AMD will come out with a prosumer or consumer card for AI at home which is rather slow but has 48GB and is not too expensive.
(If you buy a K80 or M40 on eBay, make sure not to buy the 12GB versions by accident while only looking at the price; they aren't much cheaper. The K80 should have 2x12GB and the M40 24GB.)
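To make the Whisper part of that plan concrete, here's a minimal transcription sketch using the openai-whisper Python package. The audio path and model size are placeholders; on an SBC you'd stick to "tiny"/"base", while the Arc/3060-class server could run "medium" or "large".

```python
# Minimal sketch: transcribe one audio clip with openai-whisper.
# Assumes `pip install openai-whisper` and ffmpeg on PATH; path and model size are placeholders.
import whisper

def transcribe(path: str, model_size: str = "base") -> str:
    model = whisper.load_model(model_size)   # "tiny"/"base" for SBC relays, larger on the server
    result = model.transcribe(path)          # returns a dict with "text" and per-segment info
    return result["text"].strip()

if __name__ == "__main__":
    print(transcribe("living_room_mic.wav"))
```

The SBCs would only need to ship the resulting text (not raw audio) to the decision server, which keeps the always-on microphones a little less risky.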
>>23346 I hope the K80 works for you. I was thinking of getting two, but the support for them seems abysmal. They used to be $200 used before shipping. Colab isn't what it used to be either: the free version will sometimes boot you off after 30 minutes, or a few hours into training with Pro, unless you pay big for compute credits. You're much better off running your own JupyterLab notebook, or renting an instance off vast.ai or runpod.io if you don't have access to a GPU.
>>23351
>support for them seems abysmal.
I think you need old versions of the software, but I also remember some people maintaining support for old GPUs. I might need to compile some of it myself, though. I hope it works out, but the risk isn't very high.
>They used to be $200 used before shipping.
They're down to $60-70 before shipping now. Some recommend going straight for the M40, which is much newer, but that's $120-130 before shipping.
>JupyterLab notebook or renting an instance off vast.ai or runpod.io if you don't have access to a GPU.
Right, I forgot about those while writing this.
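Since old Teslas like the K80 (compute capability 3.7) get dropped by newer CUDA/PyTorch builds, a quick sanity check of what your installed stack actually sees can save pain before committing to a card. A rough PyTorch sketch; the exact minimum capability depends on the wheel you install, so treat the cutoff comment as an assumption to verify.

```python
# Rough sketch: report what the installed PyTorch build can see.
import torch

if not torch.cuda.is_available():
    print("No usable CUDA device for this PyTorch build.")
else:
    for i in range(torch.cuda.device_count()):
        props = torch.cuda.get_device_properties(i)
        vram_gb = props.total_memory / 1024**3
        print(f"GPU {i}: {props.name}, compute {props.major}.{props.minor}, {vram_gb:.1f} GiB VRAM")
        # The K80 is compute 3.7; recent official wheels generally target newer capabilities,
        # so an older card may need an older wheel or a custom build (assumption: check your wheel's notes).
```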
>>23346 Good luck, NoidoDev!
NEW THREAD NEW THREAD NEW THREAD (>>23415) (>>23415) (>>23415) (>>23415) (>>23415) NEW THREAD NEW THREAD NEW THREAD

F = ma Robowaifu Technician 12/13/2020 (Sun) 04:24:19 No.7777 [Reply] [Last]
Alright mathematicians/physicists, report in. We plebeians need your honest help to create robowaifus in beginner's terms. How do we make our robowaifus properly dance with us at the Royal Ball?
>tl;dr
Surely it will be the laws of physics, and not mere hyperbole, that bring us all real robowaifus in the end. Moar maths kthx.
122 posts and 9 images omitted.
In desperation, as a last resort, I went to Google and used their translate. He has a site at the school with his publications listed, but...no links to the code. I tried searching for the book + code and all sorts of variations. I'm usually reasonably good at finding things, but...a big blank on this code. It's also not on the Internet Archive. There's a possibility that his code, though not exactly conforming to the book, is in his papers, as his book seems to be a summation of his papers. You can find his papers here, http://libgen.rs/scimag/?q=Eduardo+Bayro-Corrochano So whatever code you are looking for, match the subject with the paper and maybe the code will be in the paper, or at least a mathematical representation of what the code is supposed to do.
More searching and I found a page full of software for Geometric Algebra. Not his, unfortunately, but lots of it, even in C++. https://ga-explorer.netlify.app/index.php/ga-software/
And look at the publications page for this. It's all about integrating GA with computing and how to go about it. Interesting blurbs:

"...Geometric Algebra (GA) in diverse fields of science and engineering. Consequently, we need better software implementations...For large-scale complex applications having many integrating parts, such as Big Data and Geographical Information Systems, we should expect the need for integrating several GAs to solve a given problem. Even within the context of a single GA space, we often need several interdependent systems of coordinates to efficiently model and solve the problem at hand. Future GA software implementations must take such important issues into account in order to scale, extend, and integrate with existing software systems, in addition to developing new ones, based on the powerful language of GA. This work attempts to provide GA software developers with a self-contained description of an extended framework for performing linear operations on GA multivectors within systems of interdependent coordinate frames of arbitrary metric. The work explains the mathematics and algorithms behind this extended framework and discusses some of its implementation schemes and use cases..."

Another paper: "...Designing software systems for Geometric Computing applications can be a challenging task. Software engineers typically use software abstractions to hide and manage the high complexity of such systems. Without the presence of a unifying algebraic system to describe geometric models, the use of software abstractions alone can result in many design and maintenance problems. Geometric Algebra (GA) can be a universal abstract algebraic language for software engineering geometric computing applications. Few sources, however, provide enough information about GA-based software implementations targeting the software engineering community. In particular, successfully introducing GA to software engineers requires quite different approaches from introducing GA to mathematicians or physicists. This article provides a high-level introduction to the abstract concepts and algebraic representations behind the elegant GA mathematical structure. ..."

https://ga-explorer.netlify.app/index.php/publications/

I'm getting the feeling that with this GA framework you can reuse the same structure over and over, saving computing resources and making all computing fit one big scheme that can be repeated with far fewer resources. Now this is VERY MUCH like that Rebol programming language I blathered on about so much. One of its BIG strengths is this unifying character of "series lists" and the manipulation of them. It's why Rebol can provide all these different functions in the software package and still be a megabyte.

I see this sort of thing all over the place. I want to emphasize I'm not a math wiz, or even a fizzle, but I'm OK at recognizing patterns, and I see a lot of computing doing this sort of thing. Like the Plan 9 operating system and the QNX operating system: they use to great effect the idea of making everything in the code pass messages instead of a mish-mash of pointers and other such drivel. A counterexample to show the difference: Linux is old school, mish-mash, so it's a huge hairball of mass and dreckage, while QNX and Plan 9 are light, tidy things. The L4 microkernel family does this also. In fact it was a dog at speed until they changed it to pass messages, then it took off. I think they use a version of it in F-16s as the OS.
Now I also know next to nothing about AI, but I do know it's a huge mass of matrix manipulations. And it's quite possible, just as with Maxwell's quaternion calculations, that GA can whittle it down to size. It may be that the same sort of resource compaction can be done for AI with GA also. Or maybe not. One more link: https://hackaday.com/2020/10/06/getting-started-with-geometric-algebra-for-robotics-computer-vision-and-more/
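For anyone who wants to poke at the idea without installing a full GA library, here's a small self-contained sketch of the rotor-style "sandwich product" for 3D rotation, written with plain quaternions in Python. It's only meant to show the flavor of rotor algebra, not the code from those papers.

```python
# Minimal sketch: rotate a 3D vector with a rotor (unit quaternion), the GA-style sandwich R v R~.
import math

def quat_mul(a, b):
    w1, x1, y1, z1 = a
    w2, x2, y2, z2 = b
    return (w1*w2 - x1*x2 - y1*y2 - z1*z2,
            w1*x2 + x1*w2 + y1*z2 - z1*y2,
            w1*y2 - x1*z2 + y1*w2 + z1*x2,
            w1*z2 + x1*y2 - y1*x2 + z1*w2)

def rotor(axis, angle):
    # Unit rotor about a given axis; note the half-angle, just like a GA rotor.
    ax, ay, az = axis
    n = math.sqrt(ax*ax + ay*ay + az*az)
    s = math.sin(angle / 2) / n
    return (math.cos(angle / 2), ax*s, ay*s, az*s)

def rotate(v, r):
    # Sandwich product R v R^(-1); for unit rotors the inverse is the conjugate.
    rv = quat_mul(r, (0.0, *v))
    w, x, y, z = quat_mul(rv, (r[0], -r[1], -r[2], -r[3]))
    return (x, y, z)

print(rotate((1.0, 0.0, 0.0), rotor((0.0, 0.0, 1.0), math.pi / 2)))  # ~ (0, 1, 0)
```

The appeal of full GA is that the same sandwich pattern keeps working for reflections, lines, and planes, not just rotations; the quaternion here is the 3D special case.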
>>23145 There's a library for that called opencv. You can do it from scratch if you want though.
>>23143 >>23144 Thanks Grommet! We'll keep looking from time to time. :^) >>23147 Thanks for the info Anon. OpenCV is pretty amazing IMO.

ITT: Anons derail the board into debate about Christianity :^) Robowaifu Technician 04/02/2020 (Thu) 02:24:54 No.2050 [Reply] [Last]
I found this project and it looks interesting. Robot wives appeal to me because I'm convinced human women, and humans in general, have flaws that make having close relationships with them a waste of energy. I'm a pathetic freshman engineering student who knows barely anything about anything. Honestly, I think current technology isn't at a place that could produce satisfying results, for me at least. I'd like something better than an actual person, not a compromise. And even if the technology were there, I have my doubts it'd be affordable to make on your own. Fingers crossed though. Anyway, what kind of behavior would you like from your robot wife? I'd like mine to be unemotional, lacking in empathy, stoic, and disinterested in personal gain or other people. I think human women would be improved if they were like that. Sorry if this thread is inappropriate.
Edited last time by Chobitsu on 04/06/2020 (Mon) 16:00:20.
155 posts and 68 images omitted.
>>20887
How Does God Love Us?

God loves everyone. And he genuinely desires their salvation. This should come as a wonderful message for anyone who is honest with himself about his immoral actions and sinful heart. God need not love us. After all, he is an eternal and triune being, whose love for himself as Father, Son, and Holy Spirit is self-sufficient and infinite. Hence, God loves us wholly and solely from his grace.

There are many points of relevance and application we can walk away with in this brief study. Here we will concentrate on two.

First, because Scripture and sound reason confirm for us that God truly loves everyone and desires their salvation, each one of us can be assured of God's genuine and saving love for us. That is, if God loves everyone, I must conclude that God loves me. Hence, we should never conclude that, whenever we sin, doubt, or even fall away from the faith for a season, God is in any way causing us to do this. Indeed, he tempts no one to sin (James 1:13), and wishes no one to doubt (James 1:5–8). Thus, whenever we sin, doubt, or fall away, we must recognize that these actions are wholly self-determined on our part.

Second, because God truly loves everyone and desires the salvation of all, the Christian should never see a nonbeliever as his enemy, but as someone God wants to be saved. As apologists, we ought to recognize that there are many different types of people and, because God desires their salvation, he has reasons available to draw them to himself. To the rationalist, we offer rational arguments for the faith; for the empiricist, we offer science; for the historian, we offer evidence from the Bible; for the artist, we offer beauty. The universal love of God should encourage us to be ready to offer different kinds of reasons for the hope within us (1 Peter 3:15).

>t. Travis Campbell

Endnotes
1. More than one objection to this proposal can be raised, but for purposes of brevity and to focus on the universal aspect of God's love, I chose to address only one. For a fuller development of these arguments for the universality of God's saving love, see Travis James Campbell, The Wonderful Decree: Reconciling Sovereign Election and Universal Benevolence (Lexham Press; forthcoming). For a slightly different approach, see D. A. Carson, The Difficult Doctrine of the Love of God (Wheaton, IL: Crossway; 2000). Dr. Carson also has helpful lectures on this topic that can be found here and here.
2. Technically, the antecedent of the word "any," in 2 Peter 3:9, is "you." But Sproul's question remains valid. Is God not wanting any of you to perish? Well, what does he mean by "you"? Is God not wanting any of you humans to perish? Or is God not wanting any of you readers of my epistle to perish? Or is God not wanting any of you elect persons, chosen unto salvation, to perish?
3. R. C. Sproul, Chosen by God (Wheaton, IL: Tyndale House Publishers, Inc., 1986), 197.
4. James R. White, The Potter's Freedom: A Defense of the Reformation and a Rebuttal to Norman Geisler's Chosen But Free (Amityville, NY: Calvary Press, 2000), 145–50.


Edited last time by Chobitsu on 02/28/2023 (Tue) 18:12:52.
>How Should Christians Think About Artificial Intelligence? with Sean McDowell https://www.youtube.com/watch?v=8R8qudyaNio I started to link this in the /meta or Society thread, b/c it's got plenty of commentary that's pertinent to robowaifus, but it's clearly coming from the Christian perspective, so I'll put it here instead.
The great John Lennox actually mentions robowaifus briefly during this -- from 8 years ago lol. >The Loud Absence: Where is God in Suffering? | John Lennox at Harvard Medical School https://www.youtube.com/watch?v=MPm6Y-pANYI >=== -minor fmt
Edited last time by Chobitsu on 06/10/2023 (Sat) 21:31:02.
>>2089 I agree with this very much. I think we have the computing power (I mean a normal desktop for thought and the higher stuff, and microcontrollers for muscle movement) to get a waifu that can walk around and recognize you, with very limited verbal ability: yes, no, maybe, ok, etc. I think we can do that now. Maybe even follow very simple commands like a dog, with a strong GPU to process the verbal commands.

However, this simple thing can rapidly grow to be far more as processing power doubles. A couple more doublings and I think you could have some basic conversation and maybe have it clean the house (remember, I'm talking about cheap consumer processors). Maybe even, with some serious programming sweat equity, you could get it to cook. And by then it should be able to blow your mind sexually.

I think there's a strong possibility that you could do the higher-level functions right now with consumer-grade (very expensive, but consumer) GPUs and A LOT of training. It would be like training a two-year-old: constantly telling it do this, don't do that, but over time the reinforcement on a "few" tasks would allow it to do them well. I think the key for the near term is to stick to a few basic tasks and low-level verbal commands and not expect serious philosophical discussions. Of course, who wants that anyways?

Zsa Zsa Gabor once said that dealing with men is easy: make sure they get adequate sex, a clean place to live with their clothes clean and tidy, and three meals a day. I would add limited bickering or nagging. If they wanted something, no more than two mentions of it a day. So if something gets missed, it and a new thing could be noted, once, then no more. I think 99% of most Men would be happy with that.
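As a toy illustration of the "dog-level commands" idea, the high-level side could start out as dumb as matching transcribed speech against a small table and refusing everything else. A sketch in Python; the phrases and action names are made-up placeholders, not anyone's actual design.

```python
# Toy sketch: map transcribed speech to a tiny fixed command set; anything else is ignored.
# Phrases and action names are hypothetical placeholders.
from typing import Optional

COMMANDS = {
    "come here": "NAVIGATE_TO_SPEAKER",
    "stop": "HALT_MOTORS",
    "go charge": "GO_TO_DOCK",
    "lights on": "RELAY_LIGHTS_ON",
}

def interpret(transcript: str) -> Optional[str]:
    text = transcript.lower().strip()
    for phrase, action in COMMANDS.items():
        if phrase in text:
            return action
    return None  # unknown speech: do nothing rather than guess

print(interpret("Could you come here please?"))  # -> NAVIGATE_TO_SPEAKER
```

The payoff of keeping it this dumb is predictability: no GPU needed for the decision step, and "reinforcement on a few tasks" just means growing the table and the handlers behind it.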
>>23130 >I think 99% of most Men would be happy with that. I think you're right. :^)

Open file (3.22 MB 3120x4160 Hello_Anons.jpg)
Sophie Bot STL Files Uploaded Robowaifu enthusiast 07/15/2020 (Wed) 20:08:20 No.4198 [Reply]
I need to sort out her CAD files more before uploading them, but the .STLs are ready. Link to Google Drive shared folder: https://drive.google.com/drive/u/0/folders/1xWilMfWDZnrt30E1Uw7hlWe6JmaigKQF
15 posts and 2 images omitted.
Not sure what this link was, but with no context and behind a URL shortener I'm assuming it's CP. If this was something on-topic, my apologies, but with all these CP bots trying to advertise here we have to be careful.
Edited last time by gator on 06/06/2023 (Tue) 15:02:46.
>>22974 It's literally Google Drive
>>22974 I take it you rm'd a link for us? That's fine if it was suspicious-looking, thanks! :^) So, if you were a legitimate poster from our board, please at least explain what the link is, if it's otherwise unclear to the uninitiated. Thanks.
>>22975
>It's literally Google Drive
Not having seen it, I can't confirm this one way or another. But I'm uncertain that 'it's G*ogle' is a solid validation, Anon.
>>22975 If it was, my bad; since it was behind a link shortener I couldn't tell. Just a single line of text explaining what it was would have been enough for me to tell it was human, though.
>>22978 Yeah, just a single link run through a link shortener, many of which we've outright filtered at this point simply because of how badly the CP posters abuse them. While we obviously won't ban link shorteners, if you're gonna use them, making clear it's posted by a human is a good idea, since otherwise it looks nearly identical to the CP bots.
Edited last time by gator on 06/06/2023 (Tue) 22:30:43.
>>22979 Got it. Thanks Gator. :^)

Embedded Programming Group Learning Thread 001 Robowaifu Technician 09/18/2019 (Wed) 03:48:17 No.367 [Reply] [Last]
Embedded Programming Group Learning Thread 001

Greetings robowaifufags.
As promised in the meta thread, this is the first installment in a series of threads where we work together on mastering the basics of embedded programming, starting with a popular, beginner-friendly AVR 8-bit microcontroller, programming it in C on linux.

>why work together on learning and making small projects that build up to the basis of a complete robot control system instead of just posting links to random microcontrollers, popular science robot articles, and coding tutorials and pretending we're helping while cheerleading and hoping others will do something so we don't have to?
Because, dumbass, no one else is going to do it. You know why, in emergency response training, they teach you that instead of yelling "somebody call an ambulance!" you should always point to or grab someone and tell that person to do it? Because everyone assumes someone else will do it, and in the end, no one does. Well, I'm talking to YOU now. Yeah, you. Buy about 20 USD worth of hardware and follow the fuck along. We're starting from zero, and I will be aiming this at people with no programming or electronics background.

>I suppose I could get off my ass and learn enough to contribute something. I mean, after all, if all of us work together we can totally build a robowaifu in no time, right?
No, the final goal of these threads is not a completed robowaifu. That's ridiculous. What we will do though, by hands-on tackling many of the problems facing robot development today, is gain practical and useful knowledge of embedding programming as well as a more grounded perspective on things.

>so we're just going to be blinking a bunch of LEDs and shit? lame.
Not quite. We will try to cover everything embedded here: basic I/O, serial communications, servo/motor control, sensor interfacing, analog/digital conversion, pulse-width modulation, timers, interrupts, I2C, SPI, microcontroller-PC interfacing, wireless communications, and more.
125 posts and 16 images omitted.
>>22890
>pages are now execute only or no execute.
For the uninitiated, you could say that this helps keep corrupt (i.e., 'hacked') code from executing. So Nagisa, off-topic, but what do you think would be involved, in a practical sense, in creating a robowaifu system based on OpenBSD? Remember that we have several hard real-time constraints (though most of it isn't under this restriction). By this question I mean primarily her onboard systems, not just a home server setup.
>>22891 OpenBSD is the worst OS for real time among the ones I've used, its task scheduler has really bad fairness guarantees and big locks in the kernel can cause most of the kernel's functionality to block while one program uses it. The audio system defaults to 160ms latency and still gets audio drops, on Gentoo Linux I could get ~17-19ms with ALSA and no realtime tweaking. We all have much to gain from portability though. OpenBSD's strong memory protections can catch memory bugs that go unnoticed on every other OS. And while doing that, it's still fast enough that you can actually run your program and test it, you can't use e.g. Valgrind on a typical video game because then it will run at sub-1fps. OpenBSD's pthreads implementation catches destroying mutexes with waiters, mpv has that bug all over, Linux libcs don't do this. This goes for other platforms too, for instance, the diet libc for Linux warns when you use a libc function that makes binaries large, it's good for when you're optimizing binary sizes. I've fixed bugs in programs that I found because I ported the program to MSVC and Microsoft's compiler correctly warned where no other compiler warned.
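If you want to put rough numbers on that kind of scheduling behaviour yourself, a crude cross-platform sketch is to request short sleeps in a loop and record how late the OS actually wakes you. The 1 ms period and sample count below are arbitrary choices, and sleep overshoot is only a proxy for scheduler latency, not a proper real-time benchmark.

```python
# Crude sketch: measure how late short sleeps return, as a proxy for scheduler latency/jitter.
import time

def sleep_jitter(period_s: float = 0.001, samples: int = 2000):
    overshoots = []
    for _ in range(samples):
        start = time.monotonic()
        time.sleep(period_s)
        overshoots.append(time.monotonic() - start - period_s)  # how late we woke up
    overshoots.sort()
    return {
        "median_ms": overshoots[len(overshoots) // 2] * 1000,
        "p99_ms": overshoots[int(len(overshoots) * 0.99)] * 1000,
        "worst_ms": overshoots[-1] * 1000,
    }

print(sleep_jitter())
```

Running the same script on the candidate OSes (idle vs. under load) gives a quick feel for the fairness and latency differences described above, before committing any motor-control loop to a platform.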
I'm going to try the flashing LEDs again, either tomorrow or the day after.
>>22892 Thanks Anon! Yes that makes sense about realtime. I'm sure we'll figure things out in the end, but r/n it's a big giant puzzle. >We all have much to gain from portability though. Excellent point. It's certainly something to strive for in all our code, to the extent feasible. Certainly during R&D prototyping, I'd say it's a high priority to attempt testing on a wide array of systems. >I ported the program to MSVC and Microsoft's compiler correctly warned where no other compiler warned. They do have a really good debugger system. Ofc some would claim they needed to heh. :^)
>>22895 Please let us know how it goes Anon! :^)

Hand Development Robowaifu Technician 07/28/2020 (Tue) 04:43:19 No.4577 [Reply] [Last]
Since we have no thread for hands, I'm now opening one. Aside from the AI, they might be the most difficult thing to achieve. For now, we could at least collect and discuss some ideas about it.
There's Will Cogley's channel: https://www.youtube.com/c/WillCogley - he's on his way to building a motor-driven biomimetic hand. It's meant for humans eventually, so there's not much space for sensors right now, which can't be wired to humans anyways. He knows a lot about hands and we might be able to learn from it, and build something (even much smaller) for our waifus.
Redesign: https://youtu.be/-zqZ-izx-7w
More: https://youtu.be/3pmj-ESVuoU
Finger prototype: https://youtu.be/MxbX9iKGd6w
CMC joint: https://youtu.be/DqGq5mnd_n4
I think the thread about sensoric skin >>242 is closely related to this topic, because it will be difficult to build a hand which also has good sensory input. We'll have to come up with some very small GelSight-like sensors.
F3 hand (pneumatic):
https://youtu.be/JPTnVLJH4SY
https://youtu.be/j_8Pvzj-HdQ
Festo hand (pneumatic):
https://youtu.be/5e0F14IRxVc
Thread >>417 is about prosthetics, especially open prosthetics. This can be relevant to some degree; however, the constraints are different. We might have more space in the forearms, but we want marvelous sensors in the hands and have to connect them to the body.


90 posts and 28 images omitted.
>>22710 It had better be able to jack me off too.
>>20643 Yes, many or all of us have seen this. We have two whole threads, one on humanoid robot videos https://alogs.space/robowaifu/res/374.html and another on waifu development projects https://alogs.space/robowaifu/res/366.html; it has at least been mentioned in the first one, since it's not a gynoid. Here's their YouTube: https://youtube.com/@CloneRobotics - they had a different name a while ago (Automaton, I think).
>The power consumption of just moving a single hand with these artificial muscles is eye-watering.
Okay, I don't remember that part, tbh.
>>20643 I disagree. I think the tech is here right now, and it's a race against time to see who gets there first, which is why I'm kind of semi-panicky.
Okay, so I definitely want to start with the robot hand, but which robot hand tutorial should I follow? Which robot hand do we want on the waifu? Should I follow a tutorial, or should someone engineer it from scratch? I don't think I can engineer it from scratch...
>>22785 I don't know what you mean by doing it from scratch; of course you would look into tutorials. A big problem with many hands is that they're not built around bones plus soft material. But if you go for that, you will most likely need to make some elements out of metal.

Privacy, Safety, & Security General Robowaifu Technician 04/20/2021 (Tue) 20:05:08 No.10000 [Reply] [Last]
This thread is for discussion, warnings, tips & tricks all related to robowaifu privacy, safety & security. Or even just general computing safety, particularly if it's related to home networks/servers/other systems that our robowaifus will be interacting with in the home. --- > thread-related (>>1671) >=== -update OP -broaden thread subject -add crosslink
Edited last time by Chobitsu on 02/23/2023 (Thu) 13:31:28.
71 posts and 14 images omitted.
>software-security/corruption -related: (>>23061)
> general-mobility safety convo -related: (>>23824, ...)
> posts-related : (>>25308, >>25330)
> surveillance-related : (>>26151)

Open file (13.41 KB 750x600 Lurking.png)
Lurk Less: Tasks to Tackle Robowaifu Technician 02/13/2023 (Mon) 05:40:18 No.20037 [Reply]
Here we share ideas on how to help the development of robowaifus. You can look for tasks to improve the board, or ones which would help to move the development forward. You could also come up with a task that needs to be worked on and ask for help; use the pattern at the top of the OP for that, replace the part in <brackets> with your own text, and post it.
>Pattern to copy and adjust for adding a task to the thread:
Task: <Description, general or very specific, and target thread for the results>
Tips: <Link additional information and add tips on how to achieve it.>
Constraints and preferences: <Things to avoid>
Results: Post your results in the prototypes thread if you designed something >>18800, or into an on-topic thread from the catalog if you found something or created a summary or diagram.
General Disclaimer: Don't discuss your work on tasks in this thread; make a posting in another thread, or several of them, and then another one here linking to it. We do have a thread for prototypes >>18800, current meta >>18173, and many others in the catalog https://alogs.space/robowaifu/catalog.html - the thread for posting the result might also be the best place to discuss things.
>General suggestions where you might be able to help:
- Go through threads in the catalog here https://alogs.space/robowaifu/catalog.html and make summaries and diagrams as pointed out starting here >>10428
- Work on parts instead of trying to develop and build a whole robowaifu
- Work on processes you find in some thread in the catalog https://alogs.space/robowaifu/catalog.html
- Test existing mechanisms shared on this board, prototypes >>18800
- Try to work on sensors in some kind of rubber skin and in parts >>95 >>242 >>419
- Keep track of other sites and similar projects, for example on YouTube, Twitter or Hackaday.
- Copy useful pieces of information from threads on other sites and boards talking about "sexbots", "chatbots", AI or something similar. Pick the right thread here: https://alogs.space/robowaifu/catalog.html


Edited last time by Chobitsu on 05/08/2023 (Mon) 11:17:16.
6 posts omitted.
Task: Test our (ball) joint designs for dolls
Tips: Here >>8690 and >>14704. Give your opinion on whether you could use it for a doll. You can also test this here >>16803 if you show interest in building an adjustable joint mechanism for a doll. The file should be behind the Mega.nz link.
Constraints and preferences: You should maybe have handled dolls before, so you can tell if any of this is useful, or where it is lacking.
General Disclaimer: Don't discuss your work on tasks in this thread; keep it in the thread for prototypes >>18800, or current meta >>18173 if it's more general. If you want to read up and discuss how to build a doll, aside from the mentioned prototypes, then this thread >>372 might also be of interest to you.
Open file (915.12 KB 1280x720 team_MaidCom_ftw.png)
Task: Choose your own task from the MaidCom project (>>15630). Task areas:
• physical body - CAD modeling her body
• internal mechanisms - designing internal components
• consolidate relevant information - the information is scattered across several threads, and is therefore hard to parse
• create robowaifu software for her - both AI, component, and systems software; see (>>22, >>367, >>20074, >>14409)
Tips: This is the board's group robowaifu project. Kiwi is the project lead, and is primarily focusing on the internal mechanisms task right now. If you can do the CAD modeling of her body, collect information together, create software, or anything else important that you can think of, it would be a big help. Please participate, Anon!
Constraints and preferences: Just be sure to read through the posts in the project thread first, both to get an idea of where we're going, and also to see how you might help out. We can do this if we all work together... all of us is far stronger than any of us!
General Disclaimer: Don't discuss your work on tasks in this thread; make a posting in another thread, or several of them, and then another one here linking to it. We do have a thread for prototypes >>18800, current meta >>18173, and many others in the catalog https://alogs.space/robowaifu/catalog.html - the thread for posting the result might also be the best place to discuss things.
Edited last time by Chobitsu on 02/14/2023 (Tue) 13:01:16.
Task: Cycloidal drive design: search for some, design some, test them. We will most likely need those. It should work in Blender or OpenSCAD and be adjustable in size and reduction, not some fixed STL file.
Tips: It's about getting the math right and translating it into a modifiable file; see the sketch after this post for one possible starting point. Also, there are different variants. Related: >>14513 >>16692 >>4536
Constraints and preferences: I think the patent for it is still valid, so maybe don't post anything which can be traced back to you. Though others did that and they seem to be fine.
General Disclaimer: Don't discuss your work on tasks in this thread; make a posting in another thread, or several of them, and then another one here linking to it. We do have a thread for prototypes >>18800 for posting your tests, and current meta >>18173 for more general questions.
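As a starting point for the "math into a modifiable file" part, here is a rough Python sketch that samples cycloidal disk profile points from the commonly used closed-form parametrization. The pin count, pin radius, and eccentricity are placeholder values, and the formula itself should be checked against a reference before trusting the geometry; with N pins the disk has N-1 lobes and the reduction is (N-1):1.

```python
# Rough sketch: sample a cycloidal disk profile from a commonly used parametrization
# (treat the formula as an assumption and verify it against a reference before cutting parts).
# The points could be fed to OpenSCAD's polygon() or a Blender script to stay parametric.
import math

def cycloid_profile(pin_circle_r=50.0, pin_r=5.0, eccentricity=2.0, n_pins=10, steps=2000):
    R, Rr, E, N = pin_circle_r, pin_r, eccentricity, n_pins
    pts = []
    for i in range(steps):
        t = 2 * math.pi * i / steps
        # Angle of the profile normal, used to offset inward by the pin radius.
        psi = math.atan2(math.sin((1 - N) * t), (R / (E * N)) - math.cos((1 - N) * t))
        x = R * math.cos(t) - Rr * math.cos(t + psi) - E * math.cos(N * t)
        y = -R * math.sin(t) + Rr * math.sin(t + psi) + E * math.sin(N * t)
        pts.append((x, y))
    return pts

points = cycloid_profile()
print(len(points), points[0])
```

Keeping the generator in a script (or porting the same math to OpenSCAD) satisfies the "adjustable in size and reduction" requirement, since the STL is then just an export of whatever parameters you pick.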
Task: Start making a list of all human-like mental tasks that current AI systems, or software with a CLI, can solve, with links to the papers or their names.
Tips: Use this >>10317 to look into the topics of your downloaded papers. Ask around on other platforms.
Constraints and preferences: Don't limit it to deep learning. Limit it to tasks humans would do in their mind, not protein folding or something like that. Maybe some more specialized tasks can go into it as well, but it's about common mental tasks. Be realistic about the quality: LLMs don't know the things they're talking about. If tasks come to mind which they can't solve, or you find a list somewhere, add those as "unsolved" to the results as well.
Results: Post your results in >>27 or in >>4143 for now; maybe make a new thread after getting deeper into it.
General Disclaimer: Don't discuss your work on tasks in this thread; make a posting in another thread, or several of them, and then another one here linking to it. The thread for posting the result might also be the best place to discuss things.
Task: Think about the config options robowaifus should have, in terms of behavior, opinion, and psychology. Make lists sorted by some criteria or pattern, and describe what you mean; a toy sketch of what such a config could look like follows this post.
Tips: Complex tests and categories for political views and psychology might help, but try to think of more: behavior in conversations, sarcasm, ... Traits used for characters in entertainment might be a useful list, ...
Constraints and preferences: No strong ones right now. If you make it too simplistic, it can be extended later. The terms should be as unambiguous as possible.
Results: Posting in the threads for personality >>18 or psychology >>2731 might be the right place. You can also pick another on-topic thread from the catalog if you prefer.
General Disclaimer: Don't discuss your work on tasks in this thread; make a posting in another thread, or several of them, and then another one here linking to it. The thread for posting the results might also be the best place to discuss things.
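The toy sketch mentioned above: a small typed config the dialogue layer could read at startup. Every field name here is just an example to show the shape of the thing, not a proposal for the actual trait list.

```python
# Illustrative sketch only: an example behavior/psychology config; all field names are hypothetical.
from dataclasses import dataclass, field

@dataclass
class PersonalityConfig:
    sarcasm: float = 0.2            # 0.0 = never sarcastic, 1.0 = constantly
    talkativeness: float = 0.5      # how often she initiates conversation
    nagging_limit: int = 2          # max reminders per topic per day
    empathy: float = 0.7            # weight given to the user's mood in replies
    opinions: dict = field(default_factory=dict)  # e.g. {"favorite_topic": "robotics"}

# The dialogue layer would read this once and condition its behavior on it.
config = PersonalityConfig(sarcasm=0.1, nagging_limit=1)
print(config)
```

Whatever the final trait list ends up being, keeping it in one flat, typed structure like this makes the options easy to document, validate, and extend later.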

Robowaifu references Anonymous 09/09/2019 (Mon) 00:09:49 No.1 [Reply] [Last]
My favorite robowaifu is Chii. I'd like to see yours.
114 posts and 116 images omitted.
>>21953 There's an amazing variety of cute Emmys!
>>21954 there really is
>>22450 Second image appears to be from a different artist?

Open file (1.08 MB 1978x802 IMG_20210725_103858.jpg)
Bot Shitposting Bread Robowaifu Technician 07/27/2021 (Tue) 09:59:33 No.11754 [Reply] [Last]
M boy need so many booboo why not just give them otherwise it ll explode like the old chinese emperor or something not getting involved going away giving up some things,trash and whatnot >=== -add thread subject
Edited last time by Chobitsu on 07/27/2021 (Tue) 12:26:28.
78 posts and 10 images omitted.
>>22238 >My GPU is busted Bummer. Really sorry to hear that. Can we somehow set up a way to all chip in to help get you a good replacement do you think? You're one of our best AI researchers after all!!
>>22243 I appreciate any help I can get. I just set up a Patreon: https://twitter.com/robowaifudev/status/1653190581580107776 There's a Monero address at the bottom of the about page if anyone is concerned about anonymity.
>>22238 >>22257
>8x NVIDIA A100 80 GB, 240 vCPUs, 1800 GiB RAM, 20 TiB storage, $12.00 / hr
https://lambdalabs.com/service/gpu-cloud
>>22344 An A100 is unnecessary. With OpenDelta modified to support gradient checkpointing, you just need RTX 3070s, which are $0.10/hr on vast.ai, or $0.12/hr for 3060s or $0.18/hr for 3080s. Partition the data (ideally one task per instance), spin up multiple instances, and merge the weights after. Optionally, spend some time weighting them accordingly. I don't have code for it yet, but it'd be possible to optimize the merge weights against some training data. Have a lot of stuff to do but on Monday I'll clean up and post the training code I made for finetuning 2.7B models on toasters with only 6 GB if anyone wants to give it a shot.
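A minimal sketch of the "merge the weights after" step, assuming each instance saved a compatible PyTorch state_dict and using a plain (optionally weighted) average; the file names and weights below are placeholders, and the actual merging robowaifudev has in mind may well differ.

```python
# Minimal sketch: weighted average of compatible PyTorch state_dicts from separate training runs.
# Assumes all checkpoints share the same architecture/keys; paths and weights are placeholders.
import torch

def merge_state_dicts(paths, weights=None):
    weights = weights or [1.0 / len(paths)] * len(paths)
    merged = None
    for path, w in zip(paths, weights):
        sd = torch.load(path, map_location="cpu")
        if merged is None:
            merged = {k: v.float() * w for k, v in sd.items()}
        else:
            for k, v in sd.items():
                merged[k] += v.float() * w
    return merged

merged = merge_state_dicts(["shard_a.pt", "shard_b.pt"], weights=[0.6, 0.4])
torch.save(merged, "merged.pt")
```

Tuning the per-shard weights against a small held-out set (rather than fixing them by hand) is the optimization step mentioned above; the averaging code itself stays the same.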
>>22349 >Have a lot of stuff to do but on Monday I'll clean up and post the training code I made for finetuning 2.7B models on toasters with only 6 GB if anyone wants to give it a shot. That sounds excellent Anon! Please do so.
