/robowaifu/ - DIY Robot Wives

Advancing robotics to a point where anime catgrill meidos in tiny miniskirts are a reality.

Build Back Better

Sorry for the delays in the BBB plan. An update will be issued in the thread in late August. -r



“In the confrontation between the stream and the rock, the stream always wins, not through strength but by perseverance.” -t. H. Jackson Brown


3D printer resources Robowaifu Technician 09/11/2019 (Wed) 01:08:12 No.94 [Reply] [Last]
Cheap and easy 3D printing is vital for a cottage industry making custom robowaifus. Please post good resources on 3D printing.

www.3dprinter.net/
https://archive.is/YdvXj
233 posts and 43 images omitted.
>>31683 You have to scroll down and around. The page you land on for the A1-mini has the package deal of the A1-mini plus a four-color printing accessory. They have a retarded number of scripts on the page, and you have to scroll around to find the $200 A1-mini alone. Maybe this link will work: https://us.store.bambulab.com/products/a1-mini?variant=41513493627016

The extra colors are not a bad deal at all, in fact a huge bargain. I could see an additional color being an asset, especially if you use one color for the base and a water-soluble filament to make voids. One use for multi-color is to print spots with the .2 mm nozzle: print a translucent white, and below or next to it print various shades of red or pink. Like magazine printing, from a distance the colors mix to make shades. So with white, black or brown, red or pink, and maybe a yellow for Asians, you could have realistic skin tones. If you can at all afford it I would recommend the multi-color kit, even if you see no immediate use for it. I read they are difficult to acquire if you don't buy them together, but you can get by with just one color. My sweet spot for buying these is $200, maybe $250.

Look at this video on the Bambu A1-mini from a long-time 3D printer user who has tried all sorts of printers. It's good, and look at the comments. https://www.youtube.com/watch?v=vBQ-QfcY3Qs

I've also been looking at resin printers. I don't know how good these are; brief reviews seem to be OK, but the specs are extraordinary. Here's a list I made. Elegoo Saturn 2-4K
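If it helps, the halftone color-mixing idea above can be sanity-checked numerically: seen from a distance, a dithered mix of two filament colors reads roughly as the coverage-weighted average of their RGB values. A minimal Python sketch (the linear-average model and the color values are simplifying assumptions; real perception also involves gamma and the filament's translucency):

```python
def mix_colors(base, accent, coverage):
    """Approximate perceived color of a halftone mix of two filaments.

    base, accent: (r, g, b) tuples in 0-255; coverage: fraction of the
    surface printed in the accent color (0.0-1.0). The linear average
    is a rough model only.
    """
    return tuple(round(b * (1 - coverage) + a * coverage)
                 for b, a in zip(base, accent))

# Translucent white base dotted with ~20% pink reads as a light skin tone.
white = (245, 245, 240)
pink = (230, 120, 130)
print(mix_colors(white, pink, 0.2))  # (242, 220, 218)
```

Varying the coverage per region is exactly the "shades from a distance" trick described above, just expressed as arithmetic.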


I can't help but wonder if you couldn't make a .1 mm or .05 mm nozzle, maybe even smaller, and ever so slowly print with the Bambu to get super-accurate prints. I see no physical reason this could not be done. I don't know what the ultimate stepper resolution is, but I suspect it's high at low speeds, since the stepper uses different waveforms to create partial steps (microstepping), I think. I believe this can be done. Bambu, according to users, appears to be responsive to customer requests and has so far been constantly updating their older printers' software. I think a super-resolution mode could be a great selling point as an option. Maybe charge more for the nozzles.
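For a rough sense of the stepper-resolution question above: the theoretical step size of a belt-driven axis follows from steps per revolution, microstepping, and pulley travel. A quick sketch with common hobby-printer defaults (1.8 degree stepper, 16x microstepping, 20-tooth GT2 pulley; these are illustrative values, not Bambu's actual specs):

```python
def xy_step_mm(full_steps_per_rev=200, microsteps=16,
               pulley_teeth=20, belt_pitch_mm=2.0):
    """Theoretical linear resolution of a belt-driven printer axis.

    Defaults are common hobby-printer values (1.8 deg stepper, 16x
    microstepping, 20T GT2 pulley), not any specific printer's specs.
    """
    mm_per_rev = pulley_teeth * belt_pitch_mm          # 40 mm of belt per rev
    return mm_per_rev / (full_steps_per_rev * microsteps)

print(xy_step_mm())  # 0.0125 mm per microstep
```

Even this coarse estimate puts the microstep well below a .1 mm nozzle, so in practice backlash, belt stretch and melt flow would limit accuracy long before the stepper does.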
Cross link from the CNC thread: >>31756
It seems someone is working on a consumer-grade SLS printer: https://www.youtube.com/watch?v=OQ6DYJtn7Bw It seems interesting, but at its current backer price of ~$3700 USD I don't know if it's exactly worth it. The machinability of printed parts is what catches my eye the most.
>>31805 >consumer grade SLS printer. That would be wonderful. Thanks for the update, Anon! Cheers.

Open file (353.55 KB 600x338 PersonalLimit.png)
Open file (21.52 KB 417x480 ReimuACute.jpg)
Open file (281.56 KB 1280x1010 ScaleForInspiration.jpg)
Open file (141.70 KB 1280x960 Joke.jpg)
Minimum waifu Kiwi 10/15/2021 (Fri) 18:34:51 No.13648 [Reply]
Minimum viable waifu. In this thread, we'll discuss what our minimums for waifus are, be it software, hardware, physical appearance, etc. This will help us focus in on the minimum goals we need to achieve as our first steps. For me, I want a waifu that will be just tall enough to hug (about 1.3 m), able to follow me around and have conversations with, will follow basic commands like going to designated spots at designated times, and look like picrel.
23 posts and 10 images omitted.
>>31593
>you could sell the hell out of them.
Indeed. I plan to be one of several billionaires selling robowaifus in the future. Also, hopefully every.single.OG. Anon here will do the same. After all, everything will be free, unfettered, & open (at least insofar as my Model A contributions to this domain are concerned). The more the merrier! :^)
>The kit would not be too difficult. It shouldn't take 500 hours to build. Maybe 50-60.
Yes. Simple assembly approaches will be vital for initial success. Thankfully individuals like Will Cogley are thinking along similar lines: (>>31472). I recommend every Anon here do the same with their design-thinking.
>Send the kits with maid dresses and catgrill ears to really piss them off.*
FTFY Anon. :DDD
>===
-prose edit
Edited last time by Chobitsu on 06/16/2024 (Sun) 05:08:44.
>>31595 I decided to do some sex doll searches. It looks like they run about $1,500-$2,500, with a few at $3,000. The majority of the higher-quality ones seem to hover around $2,000 or a few hundred more, so as I, and many others, speculated, the sweet spot is likely $2,000. I think you could sell a walking, talking, uhh... "action oriented" but limited waifu with open-source AI for $2,500, but if you could turn a profit making them at $2,000 I doubt you could keep up with demand.

I suggest a way around the law would be to sell the muscles, skeleton, processor and skin in a kit BUT ship the parts separately, and not sell any sort of orifices except maybe a mouth. There should, however, be adjustable areas for insertion of unmentionables. Possibly you could sell those on another site, with different accounts, companies, etc.

To sell these you would probably need a very reduced motherboard, memory and processor, BUT fast built-in wifi would be necessary, plus software to link securely with the client's computer. They could then upgrade the onboard computer and/or their own personal computer to amplify interaction. Wifi should be fine for voice and general direction and moving about. I think it would be plenty of bandwidth if you sent "gross" movement commands from the main computer while allowing the onboard motherboard to handle moving the body around.
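The gross/fine split sketched above, where the host PC sends coarse goals over wifi and the onboard board handles actuation, could look something like this minimal command schema (all field names hypothetical, just to illustrate how little bandwidth coarse commands need):

```python
import json

def make_command(action, target=None, speed=0.5):
    """Build a coarse movement command for the onboard controller.

    The host only names a goal ("go_to", "stop", "speak", ...);
    joint-level control stays on the robot. The schema is illustrative,
    not any existing standard.
    """
    return json.dumps({"action": action, "target": target, "speed": speed})

def parse_command(raw):
    """Decode a command on the onboard controller side."""
    return json.loads(raw)

cmd = make_command("go_to", target="kitchen", speed=0.3)
print(parse_command(cmd)["target"])  # kitchen
```

A message like this is a few dozen bytes, so even a slow wifi link has ample headroom for voice plus navigation goals, which is the point made above.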
>>31799 Thanks for the research, Anon.
>"action oriented"
As I've mentioned several times through the years here, I won't be selling any snu-snu-enabled robowaifu kits (nor will my eventual company interest), and I still maintain that every Anon here should follow that same model for their own robowaifu systems for sale. 3rd-party 'ladybits' kit manufacturers will surely pop up by the dozens once the robowaifu industry effects kick in. At most, simply accommodating their likely appearances is my advice to anons here.
>home server
Yes, I've already begun laying the foundations for all this with, well, RW Foundations: (>>14409). There's a Dollhouse that fully-assembled versions will be shipped in and stored at home in. It has associated proxy/firewall/DMZ Dollnet electronics to allow for offlined/nearlined Internet access, with physical lockout, built right in.
>"Sumomo, find good bathing software please."
This will be a great start for privacy/safety/security (>>10000) for Model A robowaifus (even for completely-networking-inexperienced anons), and will also easily provide the secured interface for Anon's other, offlined server rack(s).
Great ideas, Grommet. Thanks for the guidance! Cheers. :^)
>===
-prose edit
Edited last time by Chobitsu on 06/27/2024 (Thu) 07:09:01.
>>31800 There is a surprising amount of prefab, powered fun bits already available. Given that one would need some electronics experience to do maintenance on their own robowaifu, it wouldn't be too far of a stretch to figure out how to integrate accessories of a lewd nature.
>>31803 Yes, I think you're right Anon. Given that a typical robowaifu kit will likely require 50h - 100h of assembly time (including basic soldering), then such mods shouldn't be too challenging for a robowaifuist anon IMO. Cheers. :^)

NLP General Robowaifu Technician 09/10/2019 (Tue) 05:56:12 No.77 [Reply]
AI Natural Language Processing general thread

>"Natural language processing is a field of computer science, artificial intelligence, and computational linguistics concerned with the interactions between computers and human (natural) languages."
en.wikipedia.org/wiki/Natural_language_processing
https://archive.is/OX9IF

>Computing Machinery and Intelligence
en.wikipedia.org/wiki/Computing_Machinery_and_Intelligence
https://archive.is/pUaq4

m.mind.oxfordjournals.org/content/LIX/236/433.full.pdf
45 posts and 13 images omitted.
>>24928 Noted and thanks for your input as well. It seems my post was already edited, but I will keep formatting in mind from here on out.
>>24923 If you create a little program that can be called in Linux, it doesn't matter much to me what language you use. I've heard only positive things about Lua, though Python will get much faster with Mojo. Whatever; I prefer having rather slow code over having no code. Posting on-topic resources relating to languages other than Python in this thread would be good, but if it becomes a language comparison (war), then it's better to switch over to the thread about programming languages: >>128
>Steven Pinker, author of The Language Instinct: How the Mind Creates Language (1994) and co-founder of the Center for Cognitive Science, discusses his theory that language is a human instinct and not a human invention. Pinker defines language as ambiguous, and he believes that this ambiguity leads to the separation of words, meanings, and thoughts. He explores examples of what does and does not qualify as language, and demonstrates differences in sentence structure, dialects, pronouns, and meanings. The lecture concludes with audience questions.
https://youtu.be/eOUKcAFa_HQ
It's difficult to take notes while listening without something to write with at hand, but I try to, since otherwise I forget the important parts. Some takeaways for our use case here:
- important to realize that not all our thoughts are based on words; some come, for example, from imagining an image.
- words are just suggestions that call trains of thought to mind, which need to be interpolated. There are assumed premises which are not expressed in the language itself.
- language is a special faculty of the mind, not general intelligence.
- reading and writing are separate.
- humans have some inborn way to create a language; children seem to reinvent the language based on what they hear and their interactions, learn a sense of how the grammar works, and go from there.
- meanings and thoughts are separate from words.
- high-school seniors have about 60k words in their vocabulary.
- the order in phrases is important to convey the meaning (e.g. who did what to whom).
- roughly 100 trillion million sentences with 20 words or less.
- languages are infinitely big.
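The "100 trillion million sentences" bullet can be reconstructed with a quick calculation. Assuming roughly ten grammatical word choices at each position in a sentence (the kind of back-of-the-envelope estimate Pinker uses; treat the exact premise as approximate), sentences of up to 20 words number about:

```python
# Rough reconstruction of the estimate: ~10 grammatical word choices
# per position, summed over sentence lengths 1 through 20.
total = sum(10 ** n for n in range(1, 21))
print(f"{total:.2e}")  # 1.11e+20, i.e. about 100 million trillion
```

That 10^20 figure is the same order of magnitude as "100 trillion million", which is why the list can reasonably call languages effectively infinite.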
>Top 10 most cited and influential papers in the history of NLP
>In this video, I cover the top ten most cited and influential papers in the history of natural language processing, ranked by the number of Google Scholar citations. Some of these papers are new, while others are quite old. Not all of them use neural networks, but each one has made a significant impact in the field.
0:43 - Transformer (2017)
1:27 - LSTM (1997)
2:31 - BERT (2019)
3:17 - LDA (2003)
4:11 - Word2Vec (2013)
5:04 - GLoVE (2014)
5:54 - Encoder-decoder (2014)
6:46 - Attention (2015)
8:06 - BLEU (2002)
8:59 - Encoder-decoder (2014)
9:32 - WordNet (1995)
>Papers referenced:
1. "Attention Is All You Need" by Ashish Vaswani et al. (2017)


>>30046 This is a great comment, thanks for it.

Waifu Robotics Project Dump Robowaifu Technician 09/18/2019 (Wed) 03:45:02 No.366 [Reply] [Last]
Edited last time by rw_bumpbot on 05/25/2020 (Mon) 04:54:42.
272 posts and 218 images omitted.
Open file (462.80 KB 1024x447 Android complete v0.5.png)
Open file (321.09 KB 1024x987 Android complete v0.4_2.png)
Open file (162.63 KB 1024x576 Android complete v0.3.png)
Open file (193.30 KB 768x1024 Android v0.2.jpg)
Open file (315.16 KB 501x1024 Android v0.1.png)
Another single builder who is concentrating on legs and feet. Their Thingiverse page has several iterations of development, #4 and #5 in Fusion 360 format. Some have links to videos. https://www.thingiverse.com/jacky0815/designs
>>28970 Hmm, I thought I would've posted this already. But yeah, it was in the humanoid robots video thread: >>25463
>>28979 I did search "android" and the designer's name before posting, but neither search showed this project, and they still don't. Oh well.
Open file (178.23 KB 1655x1159 Smol.jpg)
>>31550 Thanks, I re-posted it also in the thread for bi-pedal locomotion: >>31754

Dolls, Dollmaking Techniques, and Their Use for Robowaifu Construction Greentext anon 01/16/2024 (Tue) 05:14:05 No.28514 [Reply]
Dollmaking is an ancient art, dating back as far as recorded history. Throughout these many years, dollmakers have invented and innovated at a steady pace, making each doll just a little bit better than the last while still leaning on tried-and-true techniques. This thread is here to discuss those techniques, and other relevant aspects of the doll world (including mannequins and marionettes) which we can apply to making our own waifus.

One common example most of you may have seen already is the ball-jointed doll. However, there is more than one type. Higher-end BJDs commonly use elastic string to bind limbs to sockets in a method not too dissimilar to how tendons link our bones. There are also peg-linked joints, where you (usually) have two hemispheres connected to each other and connecting two separate limb parts with pegs. Double-joints work on the same basic principle. The attached images show both principles.

Simple hinges are also quite the staple. They often have varying levels of tightness, depending on the application. A marionette, for instance, will typically have very loose hinges which allow for great ease of movement via user-operated strings. A system like that is quite easy to motorize, though one would want to consider how to get the limbs to stay in position without constant motor input.

On the more complex end of things, some dolls feature endoskeletons, which allow for more realistic and complex poses. However, in addition to being more complex, these are much more expensive, fragile, and difficult to repair. While some of these problems could be solved here, I doubt that a perfect solution is realistic at this time. It's good to keep in mind, though.

Outside of basic frame components, there are also the details, such as the eyes. Glass eyes have the interesting effect of "following" the viewer without requiring any motors or electronics. Weight and cost are both important considerations, however, especially for a full-size waifu with anime doe eyes. Resin eyes are also quite popular, as they're easier to make at home.

Wigs, of course, are also important to consider. I assume everyone here will want their waifus to have hair, and said hair will be easier to install (and detach for cleaning) if treated as a separate component. Wigs are usually kept in place with simple elastic, and I imagine it'd be easy to introduce magnets into the equation for better stability.

What do you guys think about these concepts, and what do you have to add? It's a vast field, with plenty of potential uses for waifu making. Concepts from doll-adjacent items, such as action figures, plushbots, puppets, etcetera, are also welcome for discussion.
20 posts and 8 images omitted.
>>30531 Okay, do you mean more expensive than the materials in >>30524? I have no idea. Human hair is of course the other alternative. It would be interesting to know what the pros and cons of each material are, but it's not urgent to me. Generally, the work put into it, if it's not a wig but part of the skull plate, will be the biggest cost or effort.
>>30600 Yes, though I haven't checked prices for everything individually. Presumably at times the price difference is small between raw material and something already dyed, combed, flat-ironed and fixed into pieces to attach or a weave, while other times the difference could be magnitudes more. Human hair is quite expensive; top-quality human hair of long length can cost thousands of USD... unless you want to just hang out at a salon and ask for random women's hair lol. Generally speaking, natural materials often have antimicrobial properties. Pictured is ramie.
Glad to see the doll thread is getting some activity, Greentext anon & all. Cheers! :^)
I recently bought a BJD, and I'm really impressed with it. There's a lot of engineering that goes into how these dolls function. The one I have has double joints, and the legs lock into different positions and are really stable when straight, so she can stand up on her own. I'm thinking it would be really neat to put a Bluetooth speaker inside her head, connect her to my phone, and have an AI running on it. Dolls definitely have the customization options I would eventually want to see from future robowaifus. And a smaller 1/4-scale doll is much easier to manage than a full-size 60 lb plus silicone doll. I think they make a good vessel for the current AI we have. Like a much cuter smart speaker.
>>31408 >There's a lot of engineering that goes into how these dolls function.
This. Cool idea Anon, please keep us all up to date with the progress of your robowaifu speaker project. Cheers! :^)

Open file (522.71 KB 1920x1080 gen.png)
Nandroid Generator SoaringMoon 02/29/2024 (Thu) 13:54:14 No.30003 [Reply]
I made a generator that creates nandroid images. You can use it in the browser, but a desktop version (that should be easier to use) will be available. https://soaringmoon.itch.io/nandroid-generator Not very mobile friendly unfortunately, but it does run. I made a post about this already in another thread, but I wanted to make improvements and add features to the software.
>If you have any suggestions or ideas other than custom color selection, which I am working on right now, let me know.
18 posts and 8 images omitted.
>>31447 >>31448 pretty neat, good work SoaringMoon
Fixed an issue with the desktop transparent image export blend mode.
Commemorative Wallpaper
Open file (7.49 MB 1920x1080 2024-06-06 19-26-31.mp4)
You can now customize eye shadow color.
Nice work SoaringMoon, please keep it up! Cheers. :^)

Robot Vision General Robowaifu Technician 09/11/2019 (Wed) 01:13:09 No.97 [Reply] [Last]
Cameras, Lenses, Actuators, Control Systems

Unless you want to deck out your waifubot in dark glasses and a white cane, learning about vision systems is a good idea. Please post resources here.

opencv.org/
https://archive.is/7dFuu

github.com/opencv/opencv
https://archive.is/PEFzq

www.robotshop.com/en/cameras-vision-sensors.html
https://archive.is/7ESmt
Edited last time by Chobitsu on 09/11/2019 (Wed) 01:14:45.
118 posts and 53 images omitted.
>>30879 Zoom in to the hole in the centre. It looks like there is a circuit board under there. If one were to take it out of the frame, it would require adding wires and attaching them back to the circuit board, I imagine.
>>30879 >>30880 I expect the physical positioning of the 3 camera components is tightly registered. Could be recalibrated I'm sure, but it would need to be done.
>>30879
>Depth Perception
From what I know, these systems work by knowing the distance between the two cameras, which is fixed in the hardware. If you want to do this yourself, your system would need to know that distance. I think Kudan SLAM is software doing that: >>29937 and >>10646
>Kudan Visual SLAM
>This tutorial tells you how to run a Kudan Visual SLAM (KdVisual) system using ROS 2 bags as the input containing data of a robot exploring an area
https://amrdocs.intel.com/docs/2023.1.0/dev_guide/files/kudan-slam.html
>The Camera Basics for Visual SLAM
>“Simultaneous Localization and Mapping usually refer to a robot or a moving rigid body, equipped with a specific sensor, that estimates its motion and builds a model of the surrounding environment, without a priori information [2]. If the sensor referred to here is mainly a camera, it is called Visual SLAM.”
https://www.kudan.io/blog/camera-basics-visual-slam/
>... ideal frame rate ... 15 fps: for applications with robots that move at a speed of 1~2 m/s
>The broader the camera’s field of view, the more robust and accurate SLAM performance you can expect, up to some point.
>... the larger the dynamic range is, the better the SLAM performance.
>... global shutter cameras are highly recommended for handheld, wearables, robotics, and vehicle applications.
>Baseline is the distance between the two lenses of the stereo cameras. This specification is essential for use-cases involving Stereo SLAM using stereo cameras.
>We defined Visual SLAM to use the camera as the sensor, but it can additionally fuse other sensors.
>Based on our experience, frame skip/drop, noise in images, and IR projection are typical pitfalls to watch out for.
>Color image: Greyscale images suffice for most SLAM applications
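A concrete version of the baseline point in the quotes above: for a rectified stereo pair, depth is Z = f * B / d, with focal length f in pixels, baseline B, and disparity d in pixels. A minimal sketch (the camera numbers are illustrative, not any particular sensor's specs):

```python
def depth_from_disparity(focal_px, baseline_m, disparity_px):
    """Depth of a point seen by a rectified stereo camera pair.

    focal_px: focal length in pixels; baseline_m: distance between the
    two camera centres in metres; disparity_px: horizontal pixel shift
    of the same point between the left and right images.
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive (point at infinity)")
    return focal_px * baseline_m / disparity_px

# e.g. 700 px focal length, 6 cm baseline, 30 px disparity -> roughly 1.4 m
print(depth_from_disparity(700, 0.06, 30))
```

This is also why the quoted advice cares so much about baseline: for a fixed focal length, a wider baseline gives larger disparities and therefore finer depth resolution at range, which is exactly what a DIY stereo rig has to calibrate for.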


Open file (225.52 KB 1252x902 kinectxie.jpg)
>>30877 The Kinect was cheap at $12, and I scaled it to the full-sized robot head in GIMP. I can use the main camera in the middle as the aperture, and the two projector/IR camera lenses as the eye shines. It won't look like this in the final robot head, but it will be positioned in this manner.
Will Cogley came out with a snap-fit eye mechanism (no screws needed).
>By removing ALL fasteners and using a 100% snap-fit assembly, assembly time is cut down 6-fold! Hopefully this design will also be more accessible if you struggle to get the right parts for my projects. If you don’t want to use my new PCB design (which admittedly is a work in progress), refer to my previous design for electronics/wiring instructions: https://www.notion.so/Simple-Eye-Mechanism-983e6cad7059410d9cb958e8c1c5b700?pvs=21
>If you do want to use the PCB, note that it’s still a work in progress. The design works, although there is an issue with some holes being undersized. In theory the attached file is fixed, but I’ve yet to test it myself to be 100% sure!
https://youtu.be/uzPisRAmo2s
https://nilheim-mechatronics.notion.site/Snap-fit-Eye-Mechanism-b88ae87ceae24d1ca942adf34750bf87

AI Software Robowaifu Technician 09/10/2019 (Tue) 07:04:21 No.85 [Reply] [Last]
A large amount of this board seems dedicated to hardware; what about the software end of the design spectrum? Are there any AI good enough to use?

The only ones I know about offhand are TeaseAi and Personality Forge.
124 posts and 42 images omitted.
>>12067 Is this one more of the many theoretical questions here? When building something, solutions for such problems will present themselves. Why theorize about it? And to what extent? Or, short answer: conditionals. Like "if".
>>12069
>Is this one more of the many theoretical questions here?
No. Allow me to get more specific. I have OpenCV-based code that can identify stuff (actually, I just got that OAK-D thing ( https://www.kickstarter.com/projects/opencv/opencv-ai-kit ) and ran through the tutorials), and I have a really rudimentary chatbot. When I try to think through how to integrate the two, I get confused. For example, I could pipe the output of the OAK-D identification as chat into the chatbot subroutine, but then it will respond to _every_ stimulus, or respond to visual stimuli in ways that really don't make sense.
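One simple mitigation for the "responds to every stimulus" problem described above is a gate between the vision code and the chat loop that only forwards novel detections. A sketch (the class name and cooldown threshold are arbitrary placeholders, not from any library):

```python
import time

class StimulusGate:
    """Forward a visual detection to the chatbot only when it is new.

    Suppresses repeats of the same label within a cooldown window, so
    the bot doesn't comment on the same cup every frame. The threshold
    is an arbitrary placeholder value.
    """
    def __init__(self, cooldown_s=30.0):
        self.cooldown_s = cooldown_s
        self.last_seen = {}                      # label -> last timestamp

    def should_mention(self, label, now=None):
        now = time.monotonic() if now is None else now
        last = self.last_seen.get(label)
        self.last_seen[label] = now
        return last is None or (now - last) >= self.cooldown_s

gate = StimulusGate(cooldown_s=30)
print(gate.should_mention("cup", now=0.0))   # True: first sighting
print(gate.should_mention("cup", now=5.0))   # False: too soon
print(gate.should_mention("cup", now=40.0))  # True: cooldown passed
```

In a real loop you would also weight detections by salience (size, novelty of class, motion) before letting them reach the chatbot, but even this one conditional kills most of the spam.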
>>12067 In my experience the simplest way to think about it is like a database. You give the database a query and it gives a response. That query could be text, video, audio, pose data or anything really and the same for the response. You just need data to train it on for what responses to give given certain queries. There was a post recently on multimodal learning with an existing transformer language model: >>11731 >>12079 With this for example you could output data from your OpenCV code and create an encoder that projects that data into the embedding space of the transformer model.
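The projection idea above is, at its simplest, a learned linear map from detector features into the language model's embedding dimension. A toy sketch in plain Python (the dimensions and the random matrix are placeholders; a real encoder would be trained, as in the multimodal post linked):

```python
import random

random.seed(0)

# Detector output: e.g. a one-hot class plus a bounding box -> 8 numbers.
VISION_DIM, EMBED_DIM = 8, 32

# In practice this matrix is learned so that projected vectors land near
# embeddings the language model already understands; random values here
# only demonstrate the shapes involved.
W = [[random.gauss(0, 1) for _ in range(EMBED_DIM)]
     for _ in range(VISION_DIM)]

def project(features):
    """Linear map from a vision feature vector into the LM embedding space."""
    return [sum(f * w for f, w in zip(features, col))
            for col in zip(*W)]

detection = [random.random() for _ in range(VISION_DIM)]
token_like = project(detection)
print(len(token_like))  # 32: now shaped like one of the LM's embeddings
```

The projected vector can then be prepended to the model's input sequence as a pseudo-token, which is roughly what the linked multimodal approach trains the encoder to do.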
>>12086 Exactly what my brain needed. Thanks anon.
This looks really interesting to me: llamafile https://github.com/Mozilla-Ocho/llamafile A standalone open-source AI runtime that can be run on many platforms, including the Raspberry Pi. It can also use models other than the ones available for download. https://hacks.mozilla.org/2024/04/llamafiles-progress-four-months-in/ The AI software is getting better and better, smaller and smaller, and more useful for local PCs. I imagine there's some way to train this. It could be a great advance to have these small AIs and then intensively train them on the narrow, dedicated tasks that we need. And no, I don't know exactly how to do this yet; the tech is evolving rapidly.

Open file (99.41 KB 750x995 IMG_3203.jpeg)
Robowaifu Market Chuck 01/04/2023 (Wed) 14:07:33 No.18572 [Reply]
How would the robowaifu market theoretically function? Top-of-the-line models would be very expensive, but the target demographic is poor, with little income flow. It would be a hard and gradual process to replace the supermodels that the wealthy have with robot wives, and a vast number of anime supporters with wealth or status are seeking a conventional tradwife. Essentially, it’s a very high-value commodity without a niche, so it would be hard for it to garner success as a product, and the intended audience would never receive their robowaifus. The robowaifu concept is excellent theoretically, but has no real avenue to thrive in practice. How could these issues be resolved?
---
Threads related:
>(Making money with AI and robowaifus, >>1642)
>(Early Business Ideas, >>3119)
>===
-add thread crosslinks
Edited last time by Chobitsu on 01/04/2023 (Wed) 23:42:21.
18 posts and 3 images omitted.
>>18572 >the target demographic is poor with little income flow the venn diagram of people who spend thousands of dollars on anime merch and people who'd buy the dolls is almost a circle
crosslink to a discussion in meta: >>19623
Open file (1.88 MB 3000x2191 1557058809176.jpg)
The potential for personal-injury lawsuits would be by far one of the biggest roadblocks to the commercialization of household robots like androids, even simple ones. I don't know why this isn't focused on more; an emerging technology that complex is a simple recipe for:
>Man brings android home
>Android accidentally bumps man down the stairs/cuts him with a knife while turning around/trips him/has a servo malfunction/any number of other things
>Man remembers all of those personal-injury lawyer billboards he sees on the way to work every morning
Maybe some won't sue, but the opportunity will be too tempting for others. And let's not forget the malicious actors:
>Woman brings android home
>Woman "accidentally" gets tripped by android
or better yet:
>Woman brings android home
>Puts android in room with priceless valuable
>Android "trips" into valuable, destroying it
>Sues for emotional damages
There are simply too many variables to account for when robots are in a real-world environment and ostensibly in close contact with people all of the time. I would love to start an android company, but the thought of this alone sends chills down my spine. Even if they are sold "as is", I'm sure someone could spin it into a legal battle where, even if you successfully defend yourself, you still have to pay legal fees; and if there is a concentrated effort against a company, malicious actors can just keep poking you with lawyers until they have nickel-and-dimed you to death. At least that's what I'm assuming would happen in America. You may argue that you could use onboard sensors like the robot's cameras to prove its innocence, but even then the purchaser could fake the accident so well that you couldn't disprove it, or legitimately find a bug to exploit to make the robot genuinely make a mistake in an opportune circumstance.
>>31385 There's usually a reason the manuals for even the simplest appliances come with a wall of text giving warnings, disclaimers, etc. I even once had a radio clock with a manual stating the warranty was void even for acts of God lmao
>>31385 Iron-clad user agreements are what you need.

Open file (1.76 MB 2560x1600 wp8232537.png)
Open file (427.25 KB 1920x1200 wp3421764.jpg)
Open file (368.53 KB 1680x1050 bRSM5J.jpg)
/robowaifu/meta-9: Wintertime will be sublime. Chobitsu Board owner 10/30/2023 (Mon) 00:42:15 No.26137 [Reply] [Last]
/meta, offtopic, & QTDDTOT >--- General /robowaifu/ team survey (please reply ITT) (>>15486) >--- Mini-FAQ >A few hand-picked posts on various /robowaifu/-related topics -Why is keeping mass (weight) low so important? (>>4313) -How to get started with AI/ML for beginners (>>18306) -"The Big 4" things we need to solve here (>>15182) -HOW TO SOLVE IT (>>4143) -Why we exist on an imageboard, and not some other forum platform (>>15638, >>17937) -This is madness! You can't possibly succeed, so why even bother? (>>20208, >>23969) -All AI programming is done in Python. So why are you using C & C++ here? (>>21057, >>21091, >>27167, >>29994)

Message too long. Click here to view full text.

Edited last time by Chobitsu on 02/29/2024 (Thu) 06:43:57.
509 posts and 166 images omitted.
my substack is up, with two new submissions: https://metaronin.substack.com
@All -My apologies for letting the /meta thread languish. I've been off in the mountains on a bit of a spirit quest with the Lord and others. Will rectify soon. :^) >>31382 Neat! Thanks Meta Ronin. I tried leaving you a nice comment, but your platform wouldn't accept one from me w/o an (((account))).
Open file (59.60 KB 540x540 OIG4 (1).jpeg)
>>31558 Glad to hear it!! This is definitely the time of year for that. I'm still trapped in suburbia for now, but the ocean is nearby at least. Appreciate the time, just glad to hear you of all people read it. There will be more, so it wouldn't hurt to make an account or subscribe, but I understand if you'd rather not give them your email or info. Fair enough!
>>31558 >Will rectify soon. :^) Lol, I can't post images on teh IB i'm the BO of. :D ROBI PLS TURN BACK ON TORFAG POSTING Till then, would some other Anon mind making the new /meta 10 thread please? >>31560 Nice work you've got talent Meta Ronin. I hope you may 'boot to our fiction bread?
Open file (32.74 KB 640x360 british robot.jpg)
/robowaifu/, what do you think of the most recent british election? What do you think the 2024 American election will be like?
