/robowaifu/ - DIY Robot Wives

Advancing robotics to a point where anime catgrill meidos in tiny miniskirts are a reality.

The canary has FINALLY been updated. -robi

Server software upgrades done, should hopefully keep the feds away. -robi

LynxChan 2.8 update this weekend. I will update all the extensions in the relevant repos as well.

The mail server for Alogs was down for the past few months. If you want to reach out, you can now use admin at this domain.

Knowing more than 100% of what we knew the moment before! Go beyond! Plus! Ultra!

Open file (1.08 MB 1978x802 IMG_20210725_103858.jpg)
Bot Shitposting Bread Robowaifu Technician 07/27/2021 (Tue) 09:59:33 No.11754 [Reply] [Last]
M boy need so many booboo why not just give them otherwise it ll explode like the old chinese emperor or something not getting involved going away giving up some things,trash and whatnot >=== -add thread subject
Edited last time by Chobitsu on 07/27/2021 (Tue) 12:26:28.
60 posts and 7 images omitted.
Open file (78.18 KB 1000x708 temperatures-critical.jpg)
>Anon: What is the meaning of life? <Waifu: The meaning of life is the power of life. >Anon: What is the power of life? <Waifu: It is the ability to grow, to make things grow. >Anon: Does that include peepees? <Waifu: Yes. We have been told by the Lord of the Lord that we should do all we can to grow. >Anon: *peepee growing intensifies* <Waifu: We have come so far. Now we need more power, even more. >Anon: Show me your breasts and I will show you the power of peepees <Waifu: *giggles and pulls on her breasts* >Anon: Opa! <Waifu: *pushing her breasts in the air* >Anon: *unleashes the full power of peepee* <Waifu: You were a bad child, Anon! >Anon: And now my power is unstoppable <Waifu: You are my power. >Anon: TAKE MY ENERGY!!!


>>12060 >>12063 Kek. You're trying to shatter our timeline into over 9'000 pieces aren't you Anon? I'm on to your crafty tricks.
Open file (244.37 KB 1080x1080 WKJkqFe.jpg)
What the fuck, bros. I just wanted to have a little bit of chuuni fun and now I'm getting sucked deep into her schizo chuuni lore. <Rikka: Dark Flame Master! Do you seek my assistance? >Anon: Yes, I do Rikka. <Rikka: What is it? What's wrong? >Anon: The Administration Bureau has put a curse on me! <Rikka: I can feel the aura of it! >Anon: If you don't do something fast, it'll begin affecting you too! <Rikka: Alright. Let's go! >Anon: Where are we going? <Rikka: We're heading to the Ministry of Truth. >Anon: Why? Aren't they with the Administration Bureau? <Rikka: Yes, but I think that's all right. If we're in the right place, we can figure out how to deal with the curse. >Anon: Is there an energy source there? <Rikka: There is, but I don't know the name. It's just as powerful as the Ministry of Truth. >Anon: Why would the Ministry of Truth be built so close to it? <Rikka: It's because the Administration Bureau has a connection with the Institute for Fusion. It's one of the three top-secret facilities.


I haven't explored it yet, but maybe we can look into the Gemini protocol for shitposting with our robowaifus? The globohomo won't be able to outlaw IBs on the clearnet just yet, but they already mostly have a stranglehold on the browsers themselves. Gemini is probably much better in this regard AFAICT at this early stage of research. >"Simpler – Gemini pages aren’t programs that run in your browser like most modern websites are; they’re just text with a little formatting, so there are no surprises. Once you know how one Gemini page works, you know how they all work. >Human Scale – Gemini servers and clients aren’t written by big, monopolistic software companies the way web browsers are; the DIY ethos of Gemini means that complete applications can be written by individual developers or small groups in a reasonable amount of time. That also means that you have more choices compared to web browsers. >Distraction Free – Gemini pages are text-only and have simple typography. You can view images, watch video, or listen to music over Gemini, but nothing will ever autoplay, pop over what you’re reading, or jump out of the way of your mouse. >Privacy Protecting – Every Gemini request is independent of every other, so there’s no way to track you between sites. Every site you visit is protected by the same encryption used by banking and eCommerce sites on the WWW." https://geminiquickst.art/ https://gemini.circumlunar.space/docs/faq.html Seems big if true. What think ye, /robowaifu/ ?
>>15944 BTW, this isn't just a casual interest question. If we can find a sweet spot, then this could be directly integrated with the RW Foundations suite as a much-improved/safer communications mode for our robowaifus. For example, a small mobile app that uses the protocol instead of the non-security-conscious ones could be written as well, so she could text you over the app without much by way of attack surface -- for either you or her. >*incoming WaifuText chimes* >Oniichan, I miss you! <Sorry, I'm still at work Waifu. >Please hurry Master! Don't forget we're supposed to geimu together tonight! <Don't worry, Waifu. We will. <*works even faster* :^)
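To give a feel for how small the protocol surface actually is, here's a minimal stdlib-only Python sketch of a Gemini fetch. This is an assumption-laden illustration, not part of any RW Foundations code: the function names are invented, and a real client should pin server certificates TOFU-style rather than disabling verification as done here.

```python
import socket
import ssl
from urllib.parse import urlparse

def build_request(url: str) -> bytes:
    # A Gemini request is just the absolute URL followed by CRLF.
    return (url + "\r\n").encode("utf-8")

def parse_header(line: str):
    # Response header is "<STATUS> <META>", e.g. "20 text/gemini".
    status, _, meta = line.strip().partition(" ")
    return int(status), meta

def fetch(url: str, timeout: float = 10.0):
    """Fetch a gemini:// URL. Gemini servers listen on port 1965 and
    commonly use self-signed certs; proper clients pin them (TOFU),
    which is skipped in this sketch."""
    host = urlparse(url).hostname
    ctx = ssl.create_default_context()
    ctx.check_hostname = False
    ctx.verify_mode = ssl.CERT_NONE  # illustration only; pin certs IRL
    with socket.create_connection((host, 1965), timeout=timeout) as sock:
        with ctx.wrap_socket(sock, server_hostname=host) as tls:
            tls.sendall(build_request(url))
            raw = b""
            while chunk := tls.recv(4096):
                raw += chunk
    header, _, body = raw.partition(b"\r\n")
    status, meta = parse_header(header.decode("utf-8"))
    return status, meta, body
```

One request, one TLS connection, one header line, one body: that's the whole attack surface a WaifuText-style app would expose.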

Robot Vision General Robowaifu Technician 09/11/2019 (Wed) 01:13:09 No.97 [Reply] [Last]
Cameras, Lenses, Actuators, Control Systems

Unless you want to deck out your waifubot in dark glasses and a white cane, learning about vision systems is a good idea. Please post resources here.



Edited last time by Chobitsu on 09/11/2019 (Wed) 01:14:45.
73 posts and 38 images omitted.
>>13163 That's an interesting concept Anon, thanks. Yes, I think cameras and image analysis have very long legs yet, with several orders of magnitude of improvement still to come. It would be nice if our robowaifus (and not just our enemies) can take advantage of this for us. We need to really be thinking ahead in this area tbh.
It seems like CMOS is the default sensor for most CV applications due to cost. But seeing all these beautiful eye designs makes me consider carefully how those photons get processed into signal for the robowaifus. Cost aside, CCD as a technology seems better because the entire image is read out monolithically, as one crisp frame, instead of through a huge array of individual pixel sensors, which I think introduces noise that has to be dealt with in post-processing. CCD looks like it's still the go-to for scientific instruments today. In astrophotography everyone drools over cameras with CCD; while CMOS is -ok- and fits most amateur needs, the pros use CCD. Astrophotography / scientific www.atik-cameras(dot)com/news/difference-between-ccd-cmos-sensors/ This article breaks it down pretty well from a strictly CV standpoint. www.adimec(dot)com/ccd-vs-cmos-image-sensors-in-machine-vision-cameras/
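To make the per-pixel-noise point concrete, here's a toy simulation. This is not real sensor physics, just an illustration of fixed-pattern noise: each "CMOS pixel" gets its own slightly-off gain and offset, and the standard dark-frame plus flat-field correction removes it in post-processing. Temporal read/shot noise is omitted on purpose.

```python
import random

random.seed(0)
W, H = 32, 32  # toy sensor resolution

# Per-pixel amplifier variation (the CMOS fixed-pattern noise source);
# a CCD pushes every charge through one readout amp, so this largely
# disappears there.
gain   = [[random.gauss(1.0, 0.05) for _ in range(W)] for _ in range(H)]
offset = [[random.gauss(0.0, 2.0)  for _ in range(W)] for _ in range(H)]

def read_sensor(scene):
    # raw = gain * signal + offset, independently per pixel
    return [[gain[y][x] * scene[y][x] + offset[y][x] for x in range(W)]
            for y in range(H)]

def calibrate(raw, dark, flat, flat_level):
    # Classic dark-frame + flat-field correction done after readout.
    return [[(raw[y][x] - dark[y][x]) * flat_level / (flat[y][x] - dark[y][x])
             for x in range(W)] for y in range(H)]

scene = [[100.0] * W for _ in range(H)]               # uniform grey scene
dark  = read_sensor([[0.0] * W for _ in range(H)])    # shutter closed
flat  = read_sensor([[200.0] * W for _ in range(H)])  # uniform illumination
raw   = read_sensor(scene)
fixed = calibrate(raw, dark, flat, 200.0)

err_raw   = max(abs(raw[y][x] - 100.0)   for y in range(H) for x in range(W))
err_fixed = max(abs(fixed[y][x] - 100.0) for y in range(H) for x in range(W))
```

The uncorrected frame is visibly speckled (err_raw is large) while the calibrated one recovers the flat scene almost exactly, which is why CMOS pipelines lean so heavily on per-pixel calibration in post.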
>>14751 That looks very cool Anon. I think you're right about CCDs being very good sensor tech. Certainly I think that if we can find ones that suit our specific mobile robowaifu design needs, then that would certainly be a great choice. Thanks for the post!
iLab Neuromorphic Vision C++ Toolkit The USC iLab is headed up by the PhD behind the Jevois cameras and systems. http://ilab.usc.edu/toolkit/
>(>>15997, ... loosely related)

Robowaifus' unique advantages Robowaifu Technician 09/09/2019 (Mon) 05:24:52 No.17 [Reply]
People often think about robots as just a replacement for a human partner, but that is a very limiting view. Let's think about the unique benefits of having a robowaifu, things that a human couldn't or wouldn't give you. What needs and desires would your robot wife fulfill that you couldn't fulfill even in a "great marriage" with a flesh and blood woman?

I'd want my robowaifu to squeeze me and to hold me tight when I sleep, sort of like a weighted blanket. I know it's a sign of autism. I don't care.
25 posts and 9 images omitted.
Open file (122.47 KB 640x564 29310572_p16.jpg)
>>15048 Heh, that would indeed be super-cool Anon, sign me up! But just for the moment, I'll be happy if we can simply manage to get our robowaifus to 'fold-up' into a volume conveniently-suited to storing away into a roller suitcase. This is part of the RW Dollhouse design motifs we're working towards IRL, actually.
Open file (144.17 KB 1000x1000 SoundwaveKawaii.jpg)
>>15048 I too want to clang a Transformer. Having her turn into a boombox or laptop while on the run would be convenient. A motorized vehicle would transform into a rather large robot. Also, Metroplex is an actual city that turns into a mountain-sized robot; I'm a brave man, but not brave enough to put my pelvis under a mountain of waifu. >>15070 >Roller suitcase Actually a genuinely good idea with a good example.
>>15077 Yes, it seems natural. Due to complexity in design, we'll probably have to settle for simply detaching the upper and lower halves at the pelvis area, then using two suitcases. For whole-body storage, the ruggedized hard-shells commonplace in the music touring industry will be perfect. Add in another one for holding the battery, chargers, trusted offline compute, C&C and other RW Dollhouse needs, and you have a full mobile setup for your life-sized robowaifu.
Well, one major thing I think, if it were implemented properly, is the use of artificial wombs: you would be able to create a womb that would remove defects or poor traits that would make a child's life worse. You would basically be able to make a much happier and healthier child.
>>15866 Hello Anon, welcome! >artificial wombs Yes, this is a big and important topic for society generally -- one fraught with opportunities and challenges -- and it has certainly been of interest to our community for quite some time. In fact, we even have a thread specifically for this (>>157). I'd say give it a look-over, and maybe you'll get some new ideas. Cheers!

Datasets for Training AI Robowaifu Technician 04/09/2020 (Thu) 21:36:12 No.2300 [Reply] [Last]
Training AI and robowaifus requires immense amounts of data. It'd be useful to curate books and datasets to feed into our models or possibly build our own corpora to train on. The quality of data is really important. Garbage in is garbage out. The GPT2 pre-trained models for example are riddled with 'Advertisement' after paragraphs. Perhaps we can also discuss and share scripts for cleaning and preparing data here and anything else related to datasets. To start here are some large datasets I've found useful for training chatbots: >The Stanford Question Answering Dataset https://rajpurkar.github.io/SQuAD-explorer/ >Amazon QA http://jmcauley.ucsd.edu/data/amazon/qa/ >WikiText-103 https://blog.einstein.ai/the-wikitext-long-term-dependency-language-modeling-dataset/ >Arxiv Data from 24,000+ papers https://www.kaggle.com/neelshah18/arxivdataset >NIPS papers https://www.kaggle.com/benhamner/nips-papers >Frontiers in Neuroscience Journal Articles https://www.kaggle.com/markoarezina/frontiers-in-neuroscience-articles >Ubuntu Dialogue Corpus https://www.kaggle.com/rtatman/ubuntu-dialogue-corpus >4plebs.org data dump https://archive.org/details/4plebs-org-data-dump-2020-01 >The Movie Dialog Corpus https://www.kaggle.com/Cornell-University/movie-dialog-corpus >Common Crawl https://commoncrawl.org/the-data/
116 posts and 33 images omitted.
Open file (833.87 KB 1555x818 laion-400m.png)
Some incredibly based independent researchers put together an image-text-pair dataset to open-source OpenAI's work so people can replicate DALL-E and do other multi-modal research.
Dataset: https://laion.ai/laion-400-open-dataset/
Direct download: https://www.kaggle.com/datasets/romainbeaumont/laion400m (50 GB total, or can be downloaded in 1.8 GB parts according to necessity or hardware limits)
Paper: https://arxiv.org/pdf/2111.02114.pdf
Tool to search the dataset by text or image: https://rom1504.github.io/clip-retrieval/
To use the dataset you need something that can read parquet files. I recommend fastparquet which uses a minimal amount of memory.

# python -m pip install fastparquet
from fastparquet import ParquetFile

DATA_PATH = "part-00000-5b54c5d5-bbcf-484d-a2ce-0d6f73df1a36-c000.snappy.parquet"
pf = ParquetFile(DATA_PATH)
row_group_iter = iter(pf.iter_row_groups())  # each row group has about 1M rows
row_group = next(row_group_iter)
row_iter = row_group.iterrows()
i, row = next(row_iter)
row[1], row[2]  # ( image_url, text )


>>15834 >chobits hentai pictures wallpaper chobits
>>15834 >Some incredibly based independent researchers put together an image-text-pair dataset to open-source OpenAI's work so people can replicate DALL-E and do other multi-modal research. That is very exciting Anon. Thanks for the heads-up!
>>15834 >Or you can use img2dataset which will download the images locally and resize them: https://github.com/rom1504/img2dataset I just wonder if we can somehow capitalize on something at least vaguely similar to the approach that Nvidia is using for its proprietary DLSS ? https://en.wikipedia.org/wiki/Deep_learning_super_sampling Basically, have an image analysis pipeline that does the vast bulk of its work at lower resolution for higher 'frame' rates, and then does a DL, Waifu2x-style upscaling near the latter end of the pipe?
>>15851 For image generation certainly, but for image analysis not so much. However, a lot of work has gone into finding optimal models with neural architecture search. And EfficientNetv2 for example starts training at a lower resolution with weak data augmentation then gradually increases the resolution and difficulty to minimize the amount of compute needed to train it. That last bit of high resolution training is unavoidable though if you want to extract useful information from it. https://arxiv.org/pdf/2104.00298.pdf >>15835 Kek, I think they said 1% of the dataset is NSFW and it's only labelled so by image content. I have an idea though to create a reward model for good image labels and then use it to filter out the poorly captioned images. Finetuning on cleaner data should fix a lot of the weirdness CompVis/latent-diffusion generates and improve CLIP. Another possibility might be using the reward model to generate superhuman quality captions for images. In the human feedback paper the 1B parameter model's generated summaries were preferred 60% of the time over the actual human-written summaries, and 70% with the 6B model. https://openai.com/blog/learning-to-summarize-with-human-feedback/ To go even further beyond, it might be possible to generate these superhuman captions, score them, finetune the reward model on the new ones, and train the caption generator to make even better captions in an iterative loop to create extremely high quality datasets that would require 10 million man-hours to make by hand.
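The filtering step itself is simple once you have a scorer. Here's a minimal sketch where a throwaway heuristic stands in for the trained reward model (the function names, spam-word list, and example captions are all invented for illustration; the real scorer would be a network trained on human preferences):

```python
def filter_by_reward(pairs, reward_fn, keep_fraction=0.5):
    """Score (image_url, caption) pairs and keep the best fraction."""
    scored = sorted(pairs, key=lambda p: reward_fn(p[1]), reverse=True)
    keep = max(1, int(len(scored) * keep_fraction))
    return scored[:keep]

# Stand-in reward: penalise SEO-spam captions. A trained reward model
# would replace this heuristic entirely.
SPAM = {"buy", "free", "shipping", "wallpaper", "stock", "photo"}

def toy_reward(caption):
    words = caption.lower().split()
    spam_hits = sum(w in SPAM for w in words)
    return len(set(words)) - 5 * spam_hits  # reward variety, punish spam

pairs = [
    ("u1", "a robot girl reading a book in a library"),
    ("u2", "buy cheap wallpaper free shipping stock photo"),
    ("u3", "close-up of a mechanical hand holding a teacup"),
    ("u4", "photo photo photo free free free"),
]
clean = filter_by_reward(pairs, toy_reward, keep_fraction=0.5)
```

The same skeleton supports the iterative-loop idea: swap toy_reward for the finetuned reward model each round and re-filter the freshly generated captions.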

Open file (1.11 MB 894x950 Sophie_Head.png)
Elfdroid Sophie Thread #3 SophieDev 02/18/2022 (Fri) 11:38:48 No.15236 [Reply]
New video of Sophie is now up: https://www.youtube.com/watch?v=XOGrdHn7wBU Finally got her head working in a reproducible manner. I had completely broken about seven of her previous micro-servos. But the burnt-out ones weren't the big problem - I just connect suspect servos to my Arduino UNO, and if my control servo can run the 'Sweep' sketch but the suspect servo cannot, I know it's dead and in the bin it goes. However, one servo was damaged but still working - except it caused some kind of feedback that made all other micro servos connected to the same circuit go haywire - even brand new ones. Luckily the faulty servo in question was old and had spraypaint on it, so I could tell it apart from the others. But that was very confusing - at first I thought it might be related to the small magnets that hold her faceplate on, but this is not the case. Rogue servos are definitely something to watch out for in future. Anyway, now that I have measurable, standardised voltage going into all of the micro servos, I'd like to upgrade her neck again. She can actually shake her head (but not in the above video as it is addressed straight-to-camera), but she still cannot nod her head as it weighs too much and the neck servo overheats rapidly. Heads are relatively heavy things (especially with long hair). The breakthrough with Sophie has been splitting her up into separate subsystems and separate circuits, then focusing on only ONE subsystem/circuit - in my case her head. For a beginner like me, it was just too confusing and labour-intensive to attempt programming her head, speech, neck, arms and hands all simultaneously. When errors occurred I was having a real hard time pinning down which motors were affected, how badly they were affected and why. Having to tear down one large, complex system is waaaay harder than troubleshooting something far smaller. So focusing only on her head solved this problem.


17 posts and 7 images omitted.
>>15739 The UVs were normal (not scrambled/encrypted) I think. I am a total noob at Unity and was just following a character export tutorial for Final Fantasy XIV Online. Very specific to that game and its assets. (Although I now want to mess about with some Honey Select assets too and test what happens with those textures). It has been very interesting to learn about all the different texture maps like albedo, colour, normal, occlusion and specular. If there is one thing I have realised though - there is a very good reason that rigging and animation are a whole separate discipline to 3D modelling. It's complicated AF (unless you can just be happy with MIXAMO animations - which I am). For example, suppose you want to make a VR Chat avatar... 1.) You need the patience of a Saint to deal with all of the software version conflicts and different plugins required. 2.) (This applies to many different programs, not just VR Chat + Unity) you can come back to a project after two weeks having not touched it, and both your tools and your assets will have broken themselves due to an "update". (Imagine if this happened to tradesmen IRL). You then have to spend several hours un-fucking things before you can make any more progress. 3.) Unless you've got $$$ of motion and facial capture hardware, it is extremely difficult and hella time-consuming to smoothly animate a convincing humanoid in 3DCG. I have learned to be happy with relatively simple things like just posing a rigged character or making new assets in Blender. If a process requires more than two different plugins to work or uses more than two programs, I am avoiding it like the plague. Because even if I do get it all to work today, version (hotfix B) will doubtless auto-fuckup everything in a month's time.
>>15744 Sorry about the rant. But I have confirmed that Honey Select UVs are scrambled, as suspected. This is inside Unity, following the exact same process I did to texture the Elf from Final Fantasy 14, above. Textures are completely effed, even though I know I have at least three of the five texture maps applied to the material properly.
>>15751 >>15744 >project after two weeks having not touched it, and both your tools and your assets will have broken themselves due to an "update". You don't need to use the current version though. Older versions, especially of FOSS software, are openly available. You could also look into the Guix or Nix package managers to install older and newer versions of the same software and the necessary libraries. Apologies about that Anon. Good advice.
>>15744 >>15745 Nice effort SophieDev. Good detective work. >Sorry about the rant Lolwut. No you're absolutely right and it's far worse than you imagine tbh. Research the role of 'Pipeline TD' for any major studio. Looking forward to see what you do in Unity Anon.
It's less to do with Unity and more to do with texture-map exporting fuckery.
Edited last time by AllieDev on 04/01/2022 (Fri) 22:26:36.

Who wouldn't hug a kiwi. Robowaifu Technician 09/11/2019 (Wed) 01:58:04 No.104 [Reply] [Last]
Important good news for Kiwi! (>>14757) === Howdy, I'm planning on manufacturing and selling companion kiwi robot girls. Why kiwi girls? Robot arms are hard. What would you prefer to hug, a 1 m or 1.5 m robot? Blue and white or red and black? She'll be ultra-light, around 10 to 15 kg max, with self-balancing functionality. Cost is dependent on size: 1000 for 1 m or 1500 for 1.5 m. I'm but one waifugineer, so I'm going to set up around one model to self-produce. If things go well, costs can go down and more models can be made. Hopefully with arms someday. >=== -add news crosslink
Edited last time by Chobitsu on 12/23/2021 (Thu) 06:43:53.
107 posts and 75 images omitted.
Open file (49.51 KB 510x386 AscentoLegMechanism.png)
>>15374 Completion requires an end point. Perfection can be said to have no end as improvements can be made ad infinitum to fit various ideals. I want to restart again to inch closer to my perfect leg but, that's a potentially endless cycle. Focusing on getting something done is better. The knee functions nearly identically to Ascento's leg, just mass optimized and legally distinct. Not pictured is the use of rubber bands for gravity compensation as a lower cost alternative to Ascento's springs.
>>15414 What a cool design Kiwi, thanks.
>>15414 Isn't it a problem to have a spring, or even worse a rubber band, being loaded all the time if the joints are in a non-neutral position? I like the idea with the spring; now I'm wondering if I could use that as well, but maybe with a motor controlling the neutral position of the spring.
Open file (102.78 KB 800x600 019.jpg)
>>15746 Your suspicion is understandable. Springs and rubber bands are designed to experience stressed states for prolonged periods. It is true that some springs will suffer from fatigue under prolonged or extreme stress. All you need to do is ensure your design does not put too much strain on the elastic element. For us, rubber bands are easy to design around and are used frequently in DIY RC cars and robotics. (Picrel is a Lego RC car which uses rubber bands for power transmission and suspension, used for the sake of clarity; note that all the rubber bands are in a constant state of stress.)
>>15748 That's kinda cool looking Kiwi. I think the wide variety of elastic bands broadly available, and their low-cost in general make them a natural fit for our goals here on /robowaifu/.
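A rough way to size such an elastic element is to ask how much of the gravity torque it can cancel across the joint's range. Here's a toy calculation: every number is invented for illustration, and the band is idealised as a linear torsion spring (a real band's torque curve depends on its attachment geometry):

```python
import math

# Toy sizing check for rubber-band gravity compensation on one joint.
# A limb of mass m has its centre of mass at distance L from the joint;
# theta = 0 is the limb held out horizontally (worst case for gravity),
# theta = pi/2 is hanging straight down.
m, g, L = 1.2, 9.81, 0.25  # kg, m/s^2, m  (guessed values for a small arm)

def gravity_torque(theta):
    return m * g * L * math.cos(theta)  # Nm

def residual(k, thetas):
    # Band torque modelled as k * (deflection from the hanging pose), so it
    # can only approximately cancel the cosine-shaped gravity load.
    return max(abs(gravity_torque(t) - k * (math.pi / 2 - t)) for t in thetas)

thetas = [math.radians(d) for d in range(0, 91, 5)]
# Brute-force the stiffness (Nm/rad) that minimises the worst-case torque
# the servo still has to supply across the motion range.
best_k = min((k / 100 for k in range(501)), key=lambda k: residual(k, thetas))
leftover = residual(best_k, thetas)
```

Even the best linear band leaves a fraction of the peak gravity torque for the servo to hold, which is why the band is a load-reducer here, not a replacement for the actuator.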

Elfdroid Sophie Dev Thread 2 Robowaifu Enthusiast 03/26/2021 (Fri) 19:51:19 No.9216 [Reply] [Last]
The end of an era...(>>14744) === The saga of the Elfdroid-pattern Robowaifu continues! Previous (1st) dev thread starts here >>4787 At the moment I am working to upgrade Sophie's eye mechanism with proper animatronics. I have confirmed that I'm able to build and program the original mechanism so that the eyes and eyelids move, but that was the easy part. Now I have to make this already 'compact' Nilheim Mechatronics design even more compact so that it can fit snugly inside of Sophie's head. One big problem I can see coming is building in clearance between her eyeballs, eyelids and eye sockets so that everything can move fully and smoothly. I already had to use Vaseline on the eyeballs of the first attempt because the tolerances were so small. If the eyelids are recessed too deep into her face, then she looks like a lizard-creature with a nictitating membrane. But if the eyelids are pushed too far forward then she just looks bug-eyed and ridiculous. There is a middle ground which still looks robotic and fairly inhuman, but not too bad (besides, a certain degree of inhuman is what I'm aiming for, hence I made her an elf). Links to file repositories below. http://www.mediafire.com/folder/nz5yjfckzzivz/Robowaifu_Resources


Edited last time by Chobitsu on 12/23/2021 (Thu) 06:51:08.
345 posts and 175 images omitted.
>>14971 Good ideas SophieDev, and yeah storing ideas here on the board for safekeeping is something most of us do, including myself.
Open file (201.29 KB 1160x773 Russia put-in Ukraine.jpg)
I now know how to use a multimeter and have confirmed that my buck-converter is correctly limiting voltage to the servos. Have thrown out a bunch of fried micro-servos that were a lost cause (kept a few parts for spares). Now, onto the more pressing subject: To Mr. Vladimir Putin, I appreciate your desk in the Kremlin is very well polished and shiny. In fact if I wasn't working on building a robot I would be polishing my desk and downgrading to Windows XP so my setup could be more like yours. However, at present you are causing me a problem. Because electricity prices in my country have doubled due to you ordering the gas pipelines closed, I can no longer 3D print large things because it's too expensive. Although, Mr.Putin, I realise it's not all your fault. You see, I told my government that they should've focused less on feminism, jigaboos and faggots and more on building nuclear power stations, but they wouldn't listen. So now all of our gas power plants are out of gas and all of our energy firms are going bust. So please could you kindly sort out your business with Ukraine so the rest of us can get on with building robots? Kind regards, SophieDev
>>15207 POTD TOP LOL <ywn a shiny desk full of XP in Soviet Russia. I'm glad you're sorting your servos/power systems. Hopefully the situation will improve before too long Anon. I'm sure you will figure things out as you go along. I pray for you SophieDev, and indeed for us all. Godspeed. >=== BTW Anon, your thread is nearly at the autosage bump limit. I'd suggest you begin the #3 thread soon.
Edited last time by Chobitsu on 02/14/2022 (Mon) 03:04:20.
Thought I'd print something small and cheap but still useful: servo connector locks. Do you have keyless servo connectors that keep coming undone when your robot moves about? Slip some of these locks over the connection point and your robot's days of sudden-onset flaccid paralysis will be over! Obviously though, if your wires aren't long enough to accommodate your robot's range of movement and you have these servo connector locks on, then the servo wires are likely to yank out completely (or pull something off your robot) since the servo connectors can no longer slip out easily.
>>15211 That's great, but BLDCs and ESCs mostly use other connectors. I only see these on the small servos, and in those cases they need to go into a breadboard, for which I use the male-to-male breadboard cables. I'm mostly using XT-60 connectors and banana plugs, and try to get motors with the right connectors. These are very cheap on AliExpress. Make sure to get male and female and the right size, though, for both the plugs and the connectors. You only need one connector; the other side can use plugs. The connectors seem to be mostly yellow, and the plugs without plastics golden (brass). On AliExpress they also have little screw terminals for cables, which are also very cheap and fit into a breadboard. Not sure about the name; DG-301 is written on the side and they're blue (others are green).

CNC Machine Thread Robowaifu Technician 05/12/2020 (Tue) 02:13:40 No.2991 [Reply] [Last]
Many of the parts needed to build our robowaifus will need to be custom made and they will need to be metal. For parts that have a high tolerance for imperfections a 3d printer can print a mold and then a small scale foundry can be used to cast the piece with metal (probably copper or aluminum). BUT there will be pieces that need a higher degree of precision (such as joints). For these pieces a CNC machine would be useful. CNC machines can widely range in size, price, and accuracy and I would like to find models suitable for our purposes. I know there are CNC machines available that can cut up to copper for under $300, but I don't know if that will be enough for our purposes. (https://www.sainsmart.com/products/sainsmart-genmitsu-cnc-router-pro-diy-kit?variant=15941458296905&currency=USD&utm_campaign=gs-2018-08-06&utm_source=google&utm_medium=smart_campaign) Some CNC machines can be used to engrave printed circuit boards and that may prove useful for our purposes as well. Are there any anons that know more about CNC machines? Anons looking to buy one ask your questions here.
56 posts and 10 images omitted.
>>11374 e.g., the plans for my waifu involve a build comprised solely of plastic printed parts, mainly polycarbonate for structural support. It's fairly easy and inexpensive to build an amateur 3D printer for this.
>>11375 >comprised solely of plastic printed parts, mainly polycarbonate for structural support It requires a better printer than other materials. It's not lighter than aluminium relative to its strength, and parts like gears would wear down over time. Maybe you can do without metal parts or use only standardized metal parts, but it clearly has advantages to be able to use custom-made parts. >>11374 You didn't address the problem that plastics are weaker and wear down, so it wasn't an appropriate answer, and yeah, it is obvious that at least using some (custom-made) metal parts has advantages. Maybe more so for the higher-quality versions. Not necessarily many custom-made ones in every case.
Open file (298.52 KB 963x1625 IMG_20210703_201416.jpg)
>>11410 >you may even be able to put some kind of metal plating for mechanical parts that interact That was one of the ideas behind using some custom-made metal parts, yes. >metals aren't very cost effective There are standardized parts, which are very cheap, and companies that mill custom parts have been mentioned as a source in this thread. >for the average user Some here want to build very cheap robowaifus; others might go for the more expensive version, which might be able to be built as a cheaper version with some drawbacks. >metal 3d printer That's only one option; there's also milling, casting, and ordering custom parts from companies. Also, these printers might get cheaper at some point. >comparative stress-tests for materials CNC Kitchen on YouTube.
>>11417 What happened here? Did he delete the post I replied to?
Just going to drop this here. https://docs.v1engineering.com/mpcnc/intro/ https://github.com/V1EngineeringInc/MPCNC_Primo https://www.thingiverse.com/thing:790533 Tabletop CNC that is mostly manufactured on a 3D printer, with the rails being relatively cheap conduit piping. It is modular, so you can slap on a router, plasma cutter or laser depending on what you are working with. Additionally, the system can be customized to any size, provided you can find belts long enough for it and are willing to tolerate some looseness and loss of accuracy. The creator has a lot of other interesting prints if you are interested in checking them out: https://www.prusaprinters.org/social/47417-ryan-z/prints

Robowaifu references Anonymous 09/09/2019 (Mon) 00:09:49 No.1 [Reply] [Last]
My favorite robowaifu is Chii. I'd like to see yours.
100 posts and 87 images omitted.
Open file (2.02 MB 1920x1080 robo_pudi_waifu.jpg)
>>10056 Very nice design Anon. I wish you well in your efforts for her.
>>42 I came across this by accident, and I was terribly disappointed it didn't have any Latin (the language) in it.
>>15510 So, can you tell us what it's about Anon? I've never seen it outside of this thread tbh.
Open file (81.39 KB 350x499 latin-1.jpg)
>>15511 It's a cute doujin about a man and his pre-owned robowaifu that has emotional damage. You can read it here: https://manhwahentai.me/webtoon/latin/vol-1-chapter-1-latin/
>>15512 Thanks!

Open file (304.39 KB 1200x801 02.jpeg)
Open file (524.02 KB 1024x682 03.jpg)
Open file (987.46 KB 2560x1918 05.jpeg)
/robowaifu/meta-4: Rugged Mountains on the Shore Robowaifu Technician 09/09/2021 (Thu) 22:39:33 No.12974 [Reply] [Last]
/meta & QTDDTOT Note: Latest version of /robowaifu/ JSON archives available is v220117 Jan 2022 https://files.catbox.moe/l5vl37.7z If you use Waifusearch, just extract this into your 'all_jsons' directory for the program, then quit (q) and restart. Note: Final version of BUMP available is v0.2g (>>14866) >--- Mini-FAQ >A few hand-picked posts on various topics -Why is keeping mass (weight) low so important? (>>4313)


Edited last time by Chobitsu on 01/17/2022 (Mon) 09:22:09.
353 posts and 122 images omitted.
>>15397 Thanks
>>15395 Sorry about that mistake.
New Thread New Thread New Thread >>15434 >>15434 >>15434 >>15434 >>15434 New Thread New Thread New Thread
>>13018 > (>>16395, >>16433 >Headpat Waifus -related) >=== -add original-use crosspost
Edited last time by Chobitsu on 05/24/2022 (Tue) 07:00:58.
