/robowaifu/ - DIY Robot Wives

Advancing robotics to a point where anime catgrill meidos in tiny miniskirts are a reality!

We are back again (again).

Our TOR hidden service has been restored.


“Boys, there ain’t no free lunches in this country. And don’t go spending your whole life commiserating that you got the raw deals. You’ve got to say, I think that if I keep working at this and want it bad enough I can have it. It’s called perseverance.” -t. Lee Iacocca


Open file (8.17 KB 224x225 index.jpg)
Wheelchair Waifus Robowaifu Technician 05/12/2020 (Tue) 00:38:56 No.2983 [Reply] [Last]
Ideally our robowaifus would be MORE able than normal humans, but we just aren't there yet. As we all know, there are lots of engineering challenges to building a full-sized robowaifu. One solution is to build smaller waifus, as >>2666 suggested, but I am assuming most of us don't want a short stack. My solution to the problems of balance, heating, and power requirements is to give our prototype robowaifu crutches and a wheelchair. It would be much easier for our robowaifu to walk with crutches than on her own two legs, and in the beginning she would probably only manage short distances even with the crutches. An electric wheelchair would solve the issue of balance entirely. A wheelchair could also have a number of different systems and components mounted to it: batteries, cooling fans, sensors, and processing units could all be mounted to the wheelchair, and we don't have to worry about the chair being pretty. A cripplebot, while not an ideal final product, would be great for prototyping designs and systems that could be used in later designs, especially those relating to sensors, navigation, and movement. Our prototype could also be fully sized. What do you think /robowaifu/, should our first prototype be a cripple?
Edited last time by Chobitsu on 06/03/2020 (Wed) 06:21:29.
66 posts and 34 images omitted.
>>24162 >Locomotion and capacity for a basic conversation are my first goals. Locomotion should be reasonably like modern R/C toys such as cars. So, a relatively-solved problem today. Conversation? Haha there's conversation and then there's conversation. Obviously chatbots have made leaps and bounds over the past year, and I predict they will continue to do so. However, human-level intelligence (HLI) conversations are so far out on the frontiers of research ATM, that I give it even odds whether we'll pull it off in our lifetimes. Don't get me wrong -- I believe we will. But the fact is that if that's the form of conversation you're going for here, then you are biting off what is arguably one of the most difficult human endeavors in history Anon. IMO much better to go for something much more modest like chat-tier inferencing first, then solve all the many-and-varied humanoid robotics tasks thereafter. In the meantime AI will continue advancing. I hope all that made sense Kiwi. Just ask if you want me to reword things (hopefully) more clearly. Cheers.
Edited last time by Chobitsu on 07/22/2023 (Sat) 04:37:07.
>>24162 >Progress report, she balances even better than I thought she would. The suspension integration into her motor holders works a treat. Wheels work great. Looking forward to her battery coming in the mail to test her under her own power. Neat! I presume the divots of your main wheels are providing convenient detents for the system to find a stable balance point? Good luck with the battery/charging systems Kiwi! Any idea how you're going to solve motion control yet? I'd think some kind of hardwired or remote joystick system may be feasible during this early, interim prototyping phase. BTW, I really like how you're building up your frame for this base unit piecemeal, and using the motor housings themselves as structural elements for the system. This kind of double-duty thinking will make things lighter & cheaper. (cf. 5 principles thread : >>24047) >>24166 >like those useless roombas, Heh, so far the single most successful consumer robot in history. I'd suggest they're of some usefulness to their masters. BTW good suggestion about the charging docks, Anon. Helpful ideas thanks. Cheers. >=== -prose edit
Edited last time by Chobitsu on 07/22/2023 (Sat) 01:18:22.
Open file (251.91 KB 593x719 NextMaSiRo.png)
The next generation MaSiRo is heading in a similar direction to MaidCom
>>24254 Neat! I think that there are similar service needs between the two projects, so tbh that outcome doesn't surprise me. How did you find out about the changes, I'm curious.
Related: >>24744 >Modular Platform, "Assistant": Wheelchair-style robot base to jumpstart projects by Lin

Open file (686.85 KB 2764x2212 Prima_Doll.jpg)
Open file (174.99 KB 720x960 Trash_Press.jpg)
Open file (359.85 KB 1200x675 ChatGPT_hustle.png)
Open file (29.42 KB 622x552 LLaMA02.png)
General Robotics/A.I./Software News, Commentary, + /pol/ Funposting Zone #3 NoidoDev ##eCt7e4 03/06/2023 (Mon) 18:57:17 No.21140 [Reply] [Last]
Anything in general related to the Robotics or A.I. industries, and any social or economic issues surrounding it (especially of robowaifus). === -note: This OP text might get an update at some point to improve things a bit. -previous threads: > #1 (>>404) > #2 (>>16732) >=== -edit subject
Edited last time by Chobitsu on 07/08/2023 (Sat) 13:10:41.
351 posts and 84 images omitted.
Open file (329.39 KB 850x1148 Karasuba.jpg)
Open file (376.75 KB 804x702 YT_AI_news_02.png)
Open file (423.49 KB 796x706 YT_AI_news_01.png)
Suggestions for the new thread pics. I picked someone from Prima Doll again, but maybe the new tradition should be to pick one from the newest good related anime. I think that's still Prima Doll, at least the newest I've seen. I was thinking about using the girl from ATRI; maybe next time. The other idea here was the relation between reading the news and having some coffee. In that case we'd mostly be stuck with Prima Doll, at least if the girl serving is also supposed to be a robot.
>>24077 They look fine Anon. I'm perfectly fine with animu grills as the opener pic. I'd personally suggest leaving your options of choice wide open. Just that the opener have great waifu appeal, which your choice here does. >tl;dr Go for it NoidoDev! :^)
Open file (449.75 KB 2048x2048 1687959908288867.jpg)
>>24080 Thanks. If people have suggestions for future opener pics I'm listening. Picrel might be it at some point. But currently it's hot in many places, so the pic of Karasuba with some iced drink fits very well.
New Thread: >>24081
>>24082 > Picrel might be it at some point. I think that's fine tbh.

(Robo)Waifu personality thread Robowaifu Technician 09/09/2019 (Mon) 05:26:21 No.18 [Reply] [Last]
Is she going to be tsundere? Deredere? Yandere or a combination? How would you code your waifus personality? Where do you draw inspiration from and can personality even be classified and successfully coded into AI? > (>>17027 - related-thread, emotions) >=== -add related-thread crosslink
Edited last time by Chobitsu on 07/26/2022 (Tue) 03:50:30.
71 posts and 36 images omitted.
I've been thinking recently about how to organize and describe personas. While text generation AI is capable of working with unstructured text, AI in general seems to work better when the data is given some sensible structure for the task at hand. Even if a good structure isn't directly necessary for text generation, it's useful for understanding exactly what sorts of things can go wrong and what sorts of data you need to provide to get the results you want. I ended up settling on 8 questions that seem to be good for getting a very comprehensive understanding of people:
1. What are they receptive to?
2. What kinds of thoughts do they have?
3. What kinds of goals do they set for themselves?
4. What do they believe?
5. What roles do they take on in life and in interactions?
6. What do they have strong emotions about?
7. How do they act on their emotions?
8. How do they believe things should work?
I think questions like these can be useful for being very specific about what an AI gets right or wrong about a persona. It might even be possible to collect data on each of these 8 aspects. Thinking about what the most useful first steps would be for creating AI with a personality, any initial step towards data collection seems like a great one. I don't know exactly how the data collection would work, but maybe there's a way to use natural language processing to extract candidate answers to these questions from fanfictions, show transcriptions, community discussions, and so on. It'd be really cool if that would let us create some sort of "persona dataset" for classifying and generating personas. From what I understand, it's not too difficult for an ML researcher to integrate basically any dataset into a text generation model. If anyone else has a different set of questions, it'd be great to see that. (The download link in >>12299 seems to be broken.)
Anything that helps us be more specific and more comprehensive about what exactly constitutes a persona seems worthwhile, especially since there seems to be so little academic work on the topic.
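The eight questions above map naturally onto a small data structure. A minimal sketch, purely illustrative (the field names and `to_prompt` helper are my own invention, not from any existing dataset):

```python
from dataclasses import dataclass, field

# Hypothetical structure mirroring the eight persona questions above.
# Field names are illustrative, one per question.
@dataclass
class Persona:
    receptive_to: list = field(default_factory=list)        # Q1
    typical_thoughts: list = field(default_factory=list)    # Q2
    goals: list = field(default_factory=list)               # Q3
    beliefs: list = field(default_factory=list)             # Q4
    roles: list = field(default_factory=list)               # Q5
    emotional_triggers: list = field(default_factory=list)  # Q6
    emotional_responses: list = field(default_factory=list) # Q7
    norms: list = field(default_factory=list)               # Q8

    def to_prompt(self) -> str:
        """Flatten the structured persona into text a chatbot prompt can use."""
        parts = []
        for name, values in vars(self).items():
            if values:
                parts.append(f"{name.replace('_', ' ')}: {', '.join(values)}")
        return "; ".join(parts)

chii = Persona(goals=["serve tea"], beliefs=["persocoms should help people"])
```

Something this simple already makes "what did the AI get wrong about the persona" checkable field by field, and the flattened string can be prepended to a chatbot context.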
>>17248 Thanks, I had a similar idea, I guess. I was mainly thinking about personas with regard to limiting the stored data on persons she knows. The robowaifu's mind would have a concept of different personas and would then forget details about persons she met over time, but keep them categorized as a certain type of person, or within a range of close types. This might of course also help from the start: having a guess which kind of person someone is, and then adapting her behavior accordingly. >Anything that helps us be more specific and more comprehensive about what exactly constitutes a persona seems worthwhile Try not to overplan things. It just needs to be a framework which can be extended. It should of course also include race, gender, political tendencies, heritage and nationality, religion, etc. I'm sure we can find an article on Facebook and advertisers where they list what kinds of factors they care about, then some psychological classifications, sociological milieus, ... From there we can weed out the more important factors.
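The "forget the details, keep the category" idea above is easy to sketch. All names and the TTL mechanism here are hypothetical, just to make the idea concrete:

```python
import time

# Sketch of a memory that drops stale details about a person over time
# but keeps the coarse persona category forever, as described above.
class PersonMemory:
    def __init__(self, name, persona_type, detail_ttl=30 * 24 * 3600):
        self.name = name
        self.persona_type = persona_type   # coarse category, never forgotten
        self.detail_ttl = detail_ttl       # seconds before a detail is dropped
        self.details = {}                  # detail -> timestamp last confirmed

    def observe(self, detail, now=None):
        self.details[detail] = time.time() if now is None else now

    def forget_stale(self, now=None):
        now = time.time() if now is None else now
        self.details = {d: t for d, t in self.details.items()
                        if now - t < self.detail_ttl}

m = PersonMemory("neighbor", persona_type="friendly engineer", detail_ttl=10)
m.observe("owns a dog", now=0)
m.forget_stale(now=100)   # the detail expires, the category survives
```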
>>17257 >Try not to overplan things. It just needs to be a framework which can be extended. Part of this is just me trying to figure out my own thoughts on the topic. I think part of it is also that we have different use cases in mind. I'm thinking more of a one-on-one chatbot, where things like race, politics, heritage, nationality, and religion don't play a big role, at least not for me. Do these play a big role in one-on-one discussions for other people? >I'm sure we can find an article on Facebook and advertisers where they list what kinds of factors they care about Keep in mind that they'll deliberately avoid using or suggesting some important things for legal and reputational reasons. Some big ones not on your list include age, parental status, income, occupation, and education. The search term for this is "demographics". When it comes to creating lists like this, I find GPTs to be very useful. I gave https://20b.eleuther.ai/ this input:
>Best demographics for targeting a message:
>- Gender
>- Politics
>- Cultural background
>- Religion
>- Age
>- Parental status
>- Income

Message too long; rest of post omitted.

>Give me an exhaustive list of all psychological traits and personality traits, which are being used to describe and diagnose humans.
ChatGPT: Describing an exhaustive list of all psychological traits and personality traits used to describe and diagnose humans is challenging, as there are numerous theories, models, and assessments in the field of psychology. However, I can provide you with a comprehensive list of commonly recognized traits:
1. Extraversion
2. Introversion
3. Neuroticism
4. Emotional Stability
5. Openness to Experience
6. Conscientiousness
7. Agreeableness
8. Honesty-Humility
9. Empathy
10. Resilience
11. Assertiveness
12. Self-Esteem

Message too long; rest of post omitted.
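A trait list like the one above suggests an obvious representation: score each trait 0.0-1.0 and compare personas by distance. A rough sketch, with made-up trait subset and scores:

```python
import math

# Hypothetical sketch: fixed trait order (a subset of the list above),
# unknown traits default to a neutral 0.5.
TRAITS = ["extraversion", "neuroticism", "openness",
          "conscientiousness", "agreeableness", "honesty_humility"]

def trait_vector(scores):
    """Map a partial {trait: score} dict onto the fixed trait order."""
    return [scores.get(t, 0.5) for t in TRAITS]

def distance(a, b):
    """Euclidean distance between two personas' trait vectors."""
    return math.sqrt(sum((x - y) ** 2
                         for x, y in zip(trait_vector(a), trait_vector(b))))

tsundere = {"extraversion": 0.7, "neuroticism": 0.8, "agreeableness": 0.3}
deredere = {"extraversion": 0.8, "neuroticism": 0.2, "agreeableness": 0.9}
```

This kind of vector is also the shape a "persona dataset" classifier would naturally output.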

>>23495 >>Give me an exhaustive list of all psychological traits and personality traits, which are being used to describe and diagnose humans. Excellent prompt, NoidoDev. GG.

Open file (363.25 KB 1027x1874 MaidComRef.png)
MaidCom Development Kiwi 03/16/2022 (Wed) 23:30:40 No.15630 [Reply] [Last]
Welcome to the /robowaifu/ board's project. Together, we will engineer a modular robot that will serve and provide companionship to their Anon faithfully. See picrel for details on the current design. This robot will begin as a basic maid robot, then move forward towards more capable robots with functionality approaching Chii/2B/Dorothy. The first goal is a meter-tall robot which functions as a mobile server, bearing an appearance that approximates her owner's waifu. This should be completed by December 2022, with further major updates happening yearly until Anons can build a PersoCom-class robot with relative ease and affordability.
347 posts and 172 images omitted.
>>23238 Because polygons are triangles. When you delete vertices you're left with pointy triangles. :^)
>>23391 Neat. Thanks.
Open file (64.61 KB 715x1365 New.jpg)
>>23393 New one, going to be Copyleft once finished.
Seems it's time for a new thread, Kiwi
NEW THREAD: >>29219

/robowaifu/meta-7: Hanging down at 7-eleven Chobitsu 02/18/2023 (Sat) 11:00:31 No.20356 [Reply] [Last]
/meta, offtopic, & QTDDTOT General /robowaifu/ team survey (please reply ITT) (>>15486) >--- Mini-FAQ >A few hand-picked posts on various /robowaifu/-related topics -Why is keeping mass (weight) low so important? (>>4313) -How to get started with AI/ML for beginners (>>18306) -"The Big 4" things we need to solve here (>>15182) -HOW TO SOLVE IT (>>4143) -Why we exist on an imageboard, and not some other forum platform (>>15638, >>17937) -This is madness! You can't possibly succeed, so why even bother? (>>20208) -All AI programming is done in Python. Why are you using C++ here? (>>21057, >>21091) -How to learn to program in C++ for robowaifus (>>18749)

Message too long; rest of post omitted.

Edited last time by Chobitsu on 04/09/2023 (Sun) 12:12:18.
361 posts and 77 images omitted.
Here's my current plan for AI hardware:
- I recently ordered a used K80 (2x12 GB) for $100/95€, shipping included. It's an old GPU, only supported by older CUDA versions, and might not run quantized models. It uses a lot of energy, but it's two GPUs with 12 GB each. I plan to pair it after a while with an RTX 3060 (12 GB, ~300€ used or ~400€ new, I think) in one home server. Context: https://technical.city/en/video/Tesla-K80m-vs-GeForce-RTX-3060
- That machine will then run my 12 GB models. For fine-tuning, or models which don't run on the K80, I'd use the 3060. I don't know yet if I can somehow join them and use 3x12 GB through the bus; it seems to need software support in the programs for running models at home.
- I plan to use online services like Colab to learn how to run these things, but keep the K80 for more private data and for learning to do it at home.
- Then I'll get some SBCs, most likely Orange Pis, which can run small models of Whisper (speech recognition). Also another small home server with an Intel Arc A380 (140-160€), which is fast enough to run the big, better Whisper model at the speed of one fast human speaker, and does so quite energy-efficiently. These devices will not run anything else, for security reasons, and will be connected to microphones which are always on. The server will receive the audio through the SBCs from all rooms over the home network (likely on a VPN using Tinc). All of them will send the transcripts to some server on my network which can then decide how to respond, most likely filtering first by how sensitive the data is.
- Some small device, like a Raspi, will maybe handle responses based on AIML or some small model.
- Questions which don't contain private information might be sent to OpenAI or another service.
- The next step up will be getting an M40 (180€) and then a used RTX 3090 (700-800€ right now, I think), putting them in another home server at some point. Of course I might use the 3090 for gaming until I get the next GPU. These can handle the models which need 24 GB. The 3090 will do the fine-tuning if I want that, since it has more power, while the M40 needs less energy. Context: https://technical.city/en/video/GeForce-RTX-3090-vs-Tesla-M40-24-GB
- The step after that might be an AMD XTX (1k-1.2k€), if it's supported well enough for AI by then. I could use it for gaming and put the 3090 in a home server with the M40. If it's possible to combine cards over PCI Express, it might be interesting to think about getting another XTX later and having 48 GB of VRAM.
- But I hope that either Intel or AMD will come out with a prosumer or consumer card for AI at home which is rather slow but has 48 GB and isn't too expensive.
(If you buy a K80 or M40 on eBay, make sure not to buy the 12 GB versions by accident while only looking at the price; they aren't much cheaper. The K80 should have 2x12 GB and the M40 24 GB.)
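The "filter first by how sensitive the data is" step above could look roughly like the sketch below. The keyword list and function name are placeholders I made up, not any real product; a serious version would use a classifier rather than keywords:

```python
# Rough sketch of the routing step described above: transcripts arrive from
# the Whisper boxes, and only non-private ones may leave the home network.
# Keywords and names are illustrative placeholders.
SENSITIVE_KEYWORDS = {"password", "bank", "address", "medical"}

def route_transcript(text: str) -> str:
    """Return 'local' for private transcripts, 'cloud' for harmless ones."""
    words = {w.strip(".,!?").lower() for w in text.split()}
    if words & SENSITIVE_KEYWORDS:
        return "local"   # handle with the on-premise model only
    return "cloud"       # safe to send to an external API
```

The same server that does this routing is a natural place to hang the Tinc VPN endpoints off of, since every transcript already passes through it.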
>>23346 I hope the K80 works for you. I was thinking of getting two but the support for them seems abysmal. They used to be $200 used before shipping. Colab isn't what it used to be either. The free version will boot you off in 30 minutes sometimes or a few hours into training with Pro unless you pay big for compute credits. You're much better off running your own JupyterLab notebook or renting an instance off vast.ai or runpod.io if you don't have access to a GPU.
>>23351 >support for them seems abysmal. I think you need old versions of the software, but I also remember some people taking care of that, to support old GPUs. I might need to compile some of it myself, though. I hope it works out, but the risk isn't very high. >They used to be $200 used before shipping. They're down to $60-70 before shipping now. Some recommend going straight for the M40, which is much newer, but $120-130 before shipping. >JupyterLab notebook or renting an instance off vast.ai or runpod.io if you don't have access to a GPU. Right, I forgot about those while writing this.
>>23346 Good luck, NoidoDev!
NEW THREAD: (>>23415)

Open file (3.22 MB 3120x4160 Hello_Anons.jpg)
Sophie Bot STL Files Uploaded Robowaifu enthusiast 07/15/2020 (Wed) 20:08:20 No.4198 [Reply]
I need to sort out her CAD files more before uploading them, but the .STLs are ready. Link to Google Drive shared folder: https://drive.google.com/drive/u/0/folders/1xWilMfWDZnrt30E1Uw7hlWe6JmaigKQF
15 posts and 2 images omitted.
Not sure what this link was, but without any context and using a URL shortener, I'm assuming it's CP. If this was something on-topic, my apologies, but with all these CP bots trying to advertise here we have to be careful.
Edited last time by gator on 06/06/2023 (Tue) 15:02:46.
>>22974 Its literately google drive
>>22974 I take it you rm'd a link for us? That's fine if it was suspicious looking, thanks! :^) So, if you were a legitimate poster from our board, please at least explain what a link is, if it's otherwise unclear to an uninitiate. Thanks. >>22975 >Its literately google drive Not having seen it, I can't confirm this one way or another. But I'm uncertain that 'it's G*ogle' is a solid validation Anon.
>>22975 If it was, my bad, since it was behind a link shortener I couldn't tell. Just a single line of text explaining what it was would have been enough for me to tell it was human though. >>22978 Yeah just a single link run through a link shortener, many of which we've outright filtered at this point simply because of how badly the cp posters abuse them. While we obviously won't ban link shorteners, if you're gonna use them making clear it's posted by a human is a good idea, since otherwise it looks nearly identical to the cp bots.
Edited last time by gator on 06/06/2023 (Tue) 22:30:43.
>>22979 Got it. Thanks Gator. :^)

Embedded Programming Group Learning Thread 001 Robowaifu Technician 09/18/2019 (Wed) 03:48:17 No.367 [Reply] [Last]
Embedded Programming Group Learning Thread 001

Greetings robowaifufags.
As promised in the meta thread, this is the first installment in a series of threads where we work together on mastering the basics of embedded programming, starting with a popular, beginner-friendly AVR 8-bit microcontroller, programming it in C on linux.

>why work together on learning and making small projects that build up to the basis of a complete robot control system instead of just posting links to random microcontrollers, popular science robot articles, and coding tutorials and pretending we're helping while cheerleading and hoping others will do something so we don't have to?
Because, dumbass, no one else is going to do it. You know why in emergency response training they teach you to, instead of yelling "somebody call an ambulance!," always point to or grab someone and tell that person to do it? Because everyone assumes someone else will do it, and in the end, no one does. Well, I'm talking to YOU now. Yeah, you. Buy about 20 USD worth of hardware and follow the fuck along. We're starting from zero, and I will be aiming this at people with no programming or electronics background.

>I suppose I could get off my ass and learn enough to contribute something. I mean, after all, if all of us work together we can totally build a robowaifu in no time, right?
No, the final goal of these threads is not a completed robowaifu. That's ridiculous. What we will do, though, by hands-on tackling many of the problems facing robot development today, is gain practical and useful knowledge of embedded programming as well as a more grounded perspective on things.

>so we're just going to be blinking a bunch of LEDs and shit? lame.
Not quite. We will try to cover everything embedded here: basic I/O, serial communications, servo/motor control, sensor interfacing, analog/digital conversion, pulse-width modulation, timers, interrupts, I2C, SPI, microcontroller-PC interfacing, wireless communications, and more.
125 posts and 16 images omitted.
>>22890 >pages are now execute only or no execute. For the uninitiate you could say that this helps keep corrupt (ie, 'hacked') code from executing. So Nagisa, off-topic; but what do you think would be involved in a practical sense of creating a robowaifu system based on OpenBSD? Remember that we have several hard-real-time constraints (though most isn't under this restriction). By this question I mean primarily her onboard systems, not just a home server setup.
>>22891 OpenBSD is the worst OS for real time among the ones I've used, its task scheduler has really bad fairness guarantees and big locks in the kernel can cause most of the kernel's functionality to block while one program uses it. The audio system defaults to 160ms latency and still gets audio drops, on Gentoo Linux I could get ~17-19ms with ALSA and no realtime tweaking. We all have much to gain from portability though. OpenBSD's strong memory protections can catch memory bugs that go unnoticed on every other OS. And while doing that, it's still fast enough that you can actually run your program and test it, you can't use e.g. Valgrind on a typical video game because then it will run at sub-1fps. OpenBSD's pthreads implementation catches destroying mutexes with waiters, mpv has that bug all over, Linux libcs don't do this. This goes for other platforms too, for instance, the diet libc for Linux warns when you use a libc function that makes binaries large, it's good for when you're optimizing binary sizes. I've fixed bugs in programs that I found because I ported the program to MSVC and Microsoft's compiler correctly warned where no other compiler warned.
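Latency figures like the ones above are easy to sanity-check on any OS. A rough sketch of measuring sleep overshoot, which is one crude proxy for how tightly the scheduler wakes a process (the 10 ms target and sample count are arbitrary choices of mine):

```python
import time

def sleep_overshoot_ms(target_ms=10.0, samples=50):
    """Measure how much time.sleep() overshoots its target, in ms.
    A rough proxy for scheduler wakeup latency; returns the worst case."""
    overshoots = []
    for _ in range(samples):
        start = time.monotonic()
        time.sleep(target_ms / 1000.0)
        elapsed = (time.monotonic() - start) * 1000.0
        overshoots.append(elapsed - target_ms)
    return max(overshoots)

worst = sleep_overshoot_ms()
```

Running the same script on OpenBSD, a stock Linux, and a Linux with realtime tweaks would put concrete numbers behind the comparison, which matters before committing a robowaifu's motor-control loop to any of them.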
I'm going to try the flashing LEDs again either tomorrow or the day after.
>>22892 Thanks Anon! Yes that makes sense about realtime. I'm sure we'll figure things out in the end, but r/n it's a big giant puzzle. >We all have much to gain from portability though. Excellent point. It's certainly something to strive for in all our code, to the extent feasible. Certainly during R&D prototyping, I'd say it's a high priority to attempt testing on a wide array of systems. >I ported the program to MSVC and Microsoft's compiler correctly warned where no other compiler warned. They do have a really good debugger system. Ofc some would claim they needed to heh. :^)
>>22895 Please let us know how it goes Anon! :^)

Robowaifu references Anonymous 09/09/2019 (Mon) 00:09:49 No.1 [Reply] [Last]
My favorite robowaifu is Chii. I'd like to see yours.
114 posts and 116 images omitted.
>>21953 There's an amazing variety of cute Emmys!
>>21954 there really is
>>22450 Second image appears to be from a different artist?

Open file (1.08 MB 1978x802 IMG_20210725_103858.jpg)
Bot Shitposting Bread Robowaifu Technician 07/27/2021 (Tue) 09:59:33 No.11754 [Reply] [Last]
M boy need so many booboo why not just give them otherwise it ll explode like the old chinese emperor or something not getting involved going away giving up some things,trash and whatnot >=== -add thread subject
Edited last time by Chobitsu on 07/27/2021 (Tue) 12:26:28.
78 posts and 10 images omitted.
>>22238 >My GPU is busted Bummer. Really sorry to hear that. Can we somehow set up a way to all chip in to help get you a good replacement do you think? You're one of our best AI researchers after all!!
>>22243 I appreciate any help I can get. I just setup a Patreon: https://twitter.com/robowaifudev/status/1653190581580107776 There's a Monero address on the bottom of the about page if anyone is concerned about anonymity
>>22238 >>22257 >8x NVIDIA A100 80 GB 240 1800 GiB 20 TiB $12.00 / hr https://lambdalabs.com/service/gpu-cloud
>>22344 A100 is unnecessary. With OpenDelta modified to support gradient checkpointing, you just need RTX 3070s, which are $0.10/hr on vast.ai, or $0.12/hr for 3060s, or $0.18/hr for 3080s. Partition the data (ideally one task per instance), spin up multiple instances, and merge the weights after. Optionally, spend some time weighting them accordingly. I don't have code for it yet, but it'd be possible to optimize the merge weights against some training data. Have a lot of stuff to do but on Monday I'll clean up and post the training code I made for finetuning 2.7B models on toasters with only 6 GB if anyone wants to give it a shot.
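The merge step described above (train shards on separate cheap instances, combine the weights after) reduces to a weighted average per parameter. A minimal sketch with plain floats standing in for tensors, uniform weights being the simplest case before any tuning:

```python
# Minimal sketch of merging finetuned checkpoints by (weighted) averaging,
# as described above. Real checkpoints hold tensors; floats keep it visible.
def merge_checkpoints(checkpoints, weights=None):
    """checkpoints: list of dicts mapping parameter name -> value.
    weights: optional per-checkpoint mix weights (default: uniform)."""
    if weights is None:
        weights = [1.0 / len(checkpoints)] * len(checkpoints)
    merged = {}
    for name in checkpoints[0]:
        merged[name] = sum(w * ckpt[name]
                           for w, ckpt in zip(weights, checkpoints))
    return merged

a = {"layer.w": 1.0, "layer.b": 0.0}   # shard 1's finetuned weights
b = {"layer.w": 3.0, "layer.b": 2.0}   # shard 2's finetuned weights
```

Optimizing the `weights` argument against held-out data is the tuning step Anon mentions not having code for yet.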
>>22349 >Have a lot of stuff to do but on Monday I'll clean up and post the training code I made for finetuning 2.7B models on toasters with only 6 GB if anyone wants to give it a shot. That sounds excellent Anon! Please do so.

R&D General Robowaifu Technician 09/10/2019 (Tue) 06:58:26 No.83 [Reply] [Last]
This is a thread to discuss smaller waifu building problems, solutions, proposals and questions that don't warrant a thread. Keep it technical. I'll start.

Liquid battery and cooling in one
Having a single "artificial blood" system for liquid cooling and power storage would eliminate the need for a vulnerable solid state battery, eliminate the need for a separate cooling system, and solve the problem of extending those systems to extremities.
I have heard of flow batteries; you'd just need to use a pair of liquids that are safe enough and not too sensitive to changes in temperature.
This one looks like it fits the bill. The downside is that your waifu would essentially be running on herbicide. (though from what I gather, it's in soluble salt form and thus less dangerous than the usual variety)
https://www.seas.harvard.edu/news/2017/02/long-lasting-flow-battery-could-run-for-more-than-decade-with-minimum-upkeep

How close are we to creating artificial muscles? And what's the second best option?
Muscles are perfect at what they do; they're powerful, compact, efficient, they carry their own weight, they aren't dependent on remote parts of the system, they can be controlled precisely, and they can perform many roles depending on their layout alone.
We could grow actual organic muscles for this purpose already but that's just fucking gross, and you'd need a lot of extra bloat to maintain them.
What we need are strands of whatever that can contract using electrical energy. Piezo does the trick at small scales, but would it be enough to match the real thing? There have been attempts, but nothing concrete so far.
What are some examples of technology that one could currently use instead?

High level and low level intelligence emulation
I've noticed a pattern in programs that emulate other computing hardware.
The first emulators that do the job at acceptable speeds are always the ones that use hacks and shortcuts to get the job done.
It comes down to a tradeoff. Analyzing and recompiling or reinterpreting the code itself on a more abstract level will introduce errors, but it is an order of magnitude more efficient than simulating every part of the circuitry down to each cycle. This is why a relatively high-level emulator of a 6th-gen video game console has system requirements close to those of a cycle-accurate emulator of the SNES.
Now, I want to present an analogy here. If training neural networks for every damn thing and trying to blindly replicate an organic system is akin to accurately emulating every logic gate in a circuit, what are some shortcuts we could take?
It is commonly repeated that a human brain has immense computing power, but this assumption is based just on the number of neurons observed, and it's likely that most of them have nothing to do with intelligence or consciousness. If we trim those, the estimated computing power drops to a more reasonable level. In addition, our computers just aren't built for doing things the way neural systems do. They're better at some things, and worse at others. If we can do something in a digital way instead of trying to simulate an analog circuit doing the same thing, that's more computing power saved, possibly bridging the gap way earlier than we expected.
The most obvious way to handle this would be doing as many mundane processing and hardware control tasks as possible in an optimized, digital way, and then using a GPU or another kind of circuit altogether to handle the magical "frontal lobe" part, so to speak.
359 posts and 146 images omitted.
>>22178 Sorry, but I can't work on this. Maybe in two or three weeks.
>>22178 >>22288 That's really good advice Noidodev, it would make for a really good new OP. Maybe the original OP will return in the meantime, but if not, then by all means give it a shot if you're willing! Cheers. :^)
>>22286 That might work. My goal would be making a non-electric one. I'd need to figure out a way to make a clockwork movement with enough torque to drive the pump, and install the self-winding system used on Rolex watches, so that it winds itself every time robot-chan moves.
New thread: (>>24152)
>>22032 >fibergrid That has got to be one of the most brilliant ideas ever. You could use a cheap ESP32-CAM camera module kit; they sell them with cameras already attached for $10 USD. Then you have up to 1622×1200 sensor channels. WOW, what a great idea. Arrange the fibers in a sort of X/Y grid, and then you know which part of a finger is touched and how far up. You could use this for position sensors also: have a rotating bump or wheel press on an array of fibers. So with a little work you could have every single touch sensor and position sensor in one $10 camera. Some ideas are so good they are just...stupendous, and this is one.
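The X/Y mapping Anon describes (each fiber terminates at a known camera pixel; brightness stands in for pressure) might look like this in outline. The grid size, threshold, and sensor names below are made-up numbers, not from any real build:

```python
# Sketch of reading a "fibergrid": each fiber ends at a known (row, col)
# camera pixel, so a bright pixel means that fiber's touch point is pressed.
# Threshold and layout are illustrative placeholders.
THRESHOLD = 128  # 8-bit brightness above which a fiber counts as "pressed"

def pressed_sensors(frame, fiber_map):
    """frame: 2D list of pixel brightnesses (rows of ints, 0-255).
    fiber_map: dict of sensor name -> (row, col) pixel where its fiber ends.
    Returns the names of all sensors currently pressed."""
    return [name for name, (r, c) in fiber_map.items()
            if frame[r][c] > THRESHOLD]

frame = [[0] * 4 for _ in range(4)]    # stand-in for a tiny camera frame
frame[1][2] = 200                      # light coming down one fiber
fibers = {"finger1_tip": (1, 2), "finger1_mid": (3, 3)}
```

With a real ESP32-CAM frame, the same lookup runs over the full sensor, so the calibration step is just recording which pixel each fiber lights up.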
