/robowaifu/ - DIY Robot Wives

Advancing robotics to a point where anime catgrill meidos in tiny miniskirts are a reality.



Open file (485.35 KB 1053x1400 0705060114258_01_Joybot.jpg)
Robowaifu Simulator Robowaifu Technician 09/12/2019 (Thu) 03:07:13 No.155
What would be a good RW simulator? I guess I'd like to start with some type of PCG (procedural content generation) solution that just builds environments, then build from there up to characters.

It would be nice if the system wasn't just pre-canned, hard-coded assets and behaviors, but was instead a true simulator system. E.g., write robotics control software that can actually calculate mechanics, kinematics, collisions, etc., and have that work correctly inside the basic simulation framework first, with an eye to eventually integrating it into IRL robowaifu mechatronic systems with few modifications. Sort of like the OpenAI Gym concept, but for waifubots.
https://gym.openai.com/
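For illustration, the Gym concept boils down to an environment class with reset() and step() methods. Here's a minimal sketch of what that interface could look like for a waifubot sim; the class name, space shapes, and 12-DoF rig are hypothetical placeholders — only the reset/step protocol is the actual Gym convention:

# Hypothetical Gym-style environment skeleton for a robowaifu simulator.
# Class name and space dimensions are made up for illustration.
import gym
import numpy as np
from gym import spaces

class RobowaifuEnv(gym.Env):
    def __init__(self):
        # e.g. normalized joint torques for a 12-DoF rig
        self.action_space = spaces.Box(-1.0, 1.0, shape=(12,), dtype=np.float32)
        # e.g. joint angles + joint velocities
        self.observation_space = spaces.Box(-np.inf, np.inf, shape=(24,), dtype=np.float32)
        self.state = np.zeros(24, dtype=np.float32)

    def reset(self):
        self.state = np.zeros(24, dtype=np.float32)
        return self.state

    def step(self, action):
        # A real implementation would run mechanics/kinematics/collisions here.
        reward, done, info = 0.0, False, {}
        return self.state, reward, done, info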
Someone mentioned Webots: Robot Simulator here >>10531
Link: https://cyberbotics.com/
>It has been designed for professional use, and it is widely used in industry, education and research. Cyberbotics Ltd. has maintained Webots as its main product continuously since 1998.
>The Webots core is based on the combination of a modern GUI (Qt), a physics engine (ODE fork) and an OpenGL 3.3 rendering engine (wren). It runs on Windows, Linux and macOS. Webots simulations can be exported as movies, interactive HTML scenes or animations, or even be streamed to any web browser using WebGL and websockets.
>Robots may be programmed in C, C++, Python, Java, MATLAB or ROS with a simple API covering all the basic robotics needs.
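For a taste of that API, a Webots controller is basically a loop around robot.step(). A minimal Python sketch, with the caveat that the motor device name "left_wheel" is an assumption that depends entirely on which robot model you load:

# Minimal Webots controller sketch (Python).
# The device name "left_wheel" is hypothetical; it must match a motor
# defined in the robot model you are actually simulating.
from controller import Robot

robot = Robot()
timestep = int(robot.getBasicTimeStep())

motor = robot.getDevice("left_wheel")
motor.setPosition(float("inf"))  # infinite position = velocity-control mode
motor.setVelocity(1.0)           # rad/s

while robot.step(timestep) != -1:
    pass  # read sensors / update actuators once per simulation tick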
>>1708 I apologize Anon, for not thanking you for this image when you posted it. I actually did appreciate it back then, but I was too distracted to share my gratitude at the time. So thanks! :^)
>>10073 We'd be interested to hear how both the training and your 'side-project' worked out Anon. Any news to share with us?
Open file (1.31 MB 2400x3600 Roastinator.png)
>>1092 >>1093 >>1101 Already been meme'd. I think something very similar to this was originally proposed as a "troll" on a certain chan, which literally led to some articles and "focus groups", and then to the creation of a monitoring group that has eyes on all "robot" threads/convos across the internet.
>The roastie fears the robot-samurai.
Open file (56.08 KB 500x356 beewatcher_lg.gif)
>>11010 Haha, well that's interesting Anon! Can you give us more details on these 'watchers'? Somebody might need to keep eyes on them tbh.
>"And now all the Hawtchers who live in Hawtch-Hawtch are watching on watch watcher watchering watch, watch watching the watcher who's watching that bee. You're not a Hawtch-Watcher, you're lucky you see!"
>t. Dr. Seuss, Did I Ever Tell You How Lucky You Are?
>>11013 Hmmm. Use machines to keep an eye on machines? In a time of deepfake video and digitally altered footage, I just hope they believe the camera feeds they're watching. 😉
What looks to be a very useful header-only C++ wrapper around the OpenGL C API. I'll try to make some time to have a look at it over the summer. https://oglplus.org/oglplus/html/index.html https://github.com/matus-chochlik/oglplus
Open file (407.22 KB 892x576 MoxRigForBlender.png)
Open file (23.94 KB 566x698 momo_rig.png)
Here's something I found through our Japanese colleagues. MoxRig / MomoRig https://mox-motion.com/ - I didn't try it out, but it seems to be useful for animation of human-like movement and simulation of robots.
>>11497 Neat! Thanks Anon, I'll give it a lookover.
>>11022
>emoji
Friendly islamic reminder: this is a chan, don't use emoji please.
Open file (1.18 MB 640x360 mujoco_02.webm)
Open file (574.04 KB 640x360 mujoco_05.webm)
Open file (837.88 KB 640x360 mujoco_06.webm)
Open file (567.63 KB 640x360 mujoco_04.webm)
MuJoCo's entire codebase has just been open-sourced: https://github.com/deepmind/mujoco This is the same physics engine OpenAI used to train a real robot hand to solve a Rubik's cube. https://www.youtube.com/watch?v=x4O8pojMF0w
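For anyone who wants to kick the tires, here's a minimal sketch using the official mujoco Python bindings; the MJCF model below is a toy single-body example I made up, not one of theirs:

# Toy MuJoCo example: load a one-body MJCF model and step the physics.
import mujoco

XML = """
<mujoco>
  <worldbody>
    <body>
      <freejoint/>
      <geom type="sphere" size="0.1"/>
    </body>
  </worldbody>
</mujoco>
"""

model = mujoco.MjModel.from_xml_string(XML)
data = mujoco.MjData(model)

for _ in range(1000):
    mujoco.mj_step(model, data)

print(data.qpos)  # position + orientation of the free-falling body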
>>16415 Amazing. Thanks very much Anon!
Open file (100.78 KB 1200x675 fb_habitat.jpg)
>>16415 MuJoCo is state of the art in ~real-time jointed rigid-body physics simulation, nice taste, anon. Still, it's not a complete environmental simulator; it's most useful for limited-domain hard-dynamics manipulation and movement experiments.
>>155 I think FAIR's Habitat simulation environment[1][2] is the most sensible choice for our needs. It's a complete system with physics, robot models, a rendering stack and ML integrations. It would be of major help to the project if we developed a waifu-specific randomized (to facilitate sim2real generalization) sim-environment, and collected enough behavioral data traces to pinpoint the necessary behavioral patterns, similar to DeepMind's recent imitation learning (IL) tour de force: https://www.deepmind.com/publications/creating-interactive-agents-with-imitation-learning
If you choose this tool, feel free to ask for help if it somehow breaks.
1. https://github.com/facebookresearch/habitat-lab
2. https://github.com/facebookresearch/habitat-sim
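To give a feel for it, the basic habitat-sim loop looks roughly like this. This is a sketch pieced together from their docs; the scene path is a placeholder and attribute names have shifted between habitat-sim versions, so treat it as approximate:

# Rough habitat-sim usage sketch. The scene path is a placeholder and
# exact attribute names may differ across habitat-sim releases.
import habitat_sim

backend_cfg = habitat_sim.SimulatorConfiguration()
backend_cfg.scene_id = "path/to/scene.glb"  # placeholder scene asset

agent_cfg = habitat_sim.agent.AgentConfiguration()
sim = habitat_sim.Simulator(habitat_sim.Configuration(backend_cfg, [agent_cfg]))

observations = sim.reset()
for _ in range(10):
    observations = sim.step("move_forward")  # one of the default discrete actions
sim.close()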
>>16446 Thanks Anon! We'll have a look into it. >that ambient occlusion Nice. Cheers.
>>16446 So far, assimp is breaking the build. After a recursive checkout, merging via git submodule foreach git merge origin master led to these errors during that process:
CONFLICT (content): Merge conflict in code/Common/Version.cpp
...
CONFLICT (modify/delete): assimpTargets-release.cmake.in deleted in origin and modified in HEAD. Version HEAD of assimpTargets-release.cmake.in left in tree.
Pressing on with abandon, I did get a little way before it failed with:
FAILED: deps/assimp/code/CMakeFiles/assimp.dir/Common/Version.cpp.o
I wanted to give it a shot at least once, but ATM I can't afford the time required to dig in and try to fix such a complex system's build from source. But thanks Anon! It certainly looks interesting, and I'm pleased to see a Globohomo behemoth such as F*cebook put out something this big with an MIT license.
>===
-minor grammar edit
Edited last time by Chobitsu on 05/25/2022 (Wed) 09:37:45.
>>16453 I managed to build and run it on Debian sid with a script inspired by this document: https://github.com/facebookresearch/habitat-sim/blob/main/BUILD_FROM_SOURCE.md
Basically you clone the repo, check out the latest stable tag, and update submodules recursively via the usual command git submodule update --init --recursive. I had to comment out a section in setup.py that deals with the cmake path to make it work, like this:
#try:
#    import cmake
#
#    # If the cmake python package is installed, use that exe
#    CMAKE_BIN_DIR = cmake.CMAKE_BIN_DIR
#except ImportError:
CMAKE_BIN_DIR = ""
Ensure you have cmake, the debian packages and the python libraries they require, then do python3 setup.py install --bullet. It should build several hundred source files via cmake and install the package. I managed to avoid using conda for this; it's simply installed system-wide. When I/we run multiple data-generating simulations at scale, some form of automated reproducible build & distribution system will be necessary, such as nix/guix or a container/VM.
>>16462 Thanks! I appreciate that you avoided conda for this. I prefer to stick closer to the hardware when feasible. I'll give your instructions a shot at some point. I'm going to have to set up a dedicated machine at some point (hopefully soon).
>nix/guix or a container/vm
Do you have any preferences? I'm certainly averse to anything proprietary tbh.
>palace
Fancy rooms for fancy robowaifus! :^)
<those portraits are fabulous
Open file (127.08 KB 769x805 GardevoirNierAutomata.jpg)
Found out about DALL·E mini: https://huggingface.co/spaces/dalle-mini/dalle-mini Can generate cute robowaifus. Like this example of Gardevoir in Nier Automata.
>>16648 Excellent find Pareto Frontier. Any advice on running a local instance?
>>16652 It's hard but doable; it boils down to making this notebook work: https://github.com/brokenmold/dalle-mini/blob/main/tools/inference/inference_pipeline.ipynb I don't have time to bring it up rn.
>>16645 >>16648 Nice. Thanks Anon.
OpenSimulator thread: >>12066
Unreal Engine file with the MaidCom project mannequin: >>25776
Open file (63.55 KB 765x728 Screenshot_149.png)
>An encyclopedia of concept simulations that computers and humans can learn from. An experimental project in machine learning. https://concepts.jtoy.net >Examples of concepts that are grounded in the physical world: https://blog.jtoy.net/examples-of-concepts-that-are-grounded-in-the-physical-world/ Somewhat related: https://blog.jtoy.net/log-of-machine-learning-work-and-experiments/
I have a hunch that adding patterns like those shown in the video might be the key to simulating bodies, especially facial expressions: https://youtu.be/UOjVNT25eHE
>>29163 Thanks a lot, NoidoDev. This is a gem. Blue Sky is one of the best studios out there; they have a ton of talented individuals.
I think I've looked around long enough and will start small, with the following plan for the next weeks:
- explore the robotic learning kit provided by Epic
- write an interface to get data in and out of a running Unreal application
The robotic learning project seems to have some basic sensors and motors implemented, perfect stuff to develop the interface against. After some googling, it seems like a UDP/TCP connection will be the way to go. After that I'll figure out the next steps.
Today I successfully wrote a TCP client/server as a WinForms application and was able to set up a TCP connection to a running Unreal project. For the Unreal TCP part I used this plugin: https://github.com/getnamo/TCP-Unreal This way any other software can be connected via TCP (just IP and port) to the running application. Next I will look into how to send data to and from Unreal. Maybe JSON? I made the TCP client/server for debugging purposes; those two are written in C#.
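If JSON ends up being the format, the client side is tiny in any language. Here's a sketch of the idea in Python rather than C# for brevity; the port number and message fields are hypothetical, and newline-delimited JSON is just one easy framing choice, not something the TCP-Unreal plugin requires:

# Sketch: send one JSON command to a running Unreal instance over raw TCP.
# Port and message schema are made up; match them to your Unreal setup.
import json
import socket

with socket.create_connection(("127.0.0.1", 3000)) as sock:
    msg = {"cmd": "set_joint", "name": "elbow_l", "angle": 0.35}
    sock.sendall((json.dumps(msg) + "\n").encode("utf-8"))
    reply = sock.recv(4096)  # whatever the Unreal side sends back, if anything
    print(reply.decode("utf-8", errors="replace"))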
>>29242 >>29296 That's great, keep us updated. Though if you start creating something, the new prototype thread might be the better place to post it: >>28715
>>29242 >>29296 That sounds very encouraging, SchaltkreisMeister! Good luck getting this system up and running successfully. Cheers. :^)
David Browne did some fast muscle design simulation: https://youtu.be/J7RxSPLLw-s
>>29390 Cool. I'm going to check this out NoidoDev, thanks!
I had an idea about using symbols to let the AI reason about objects and their positions in the world. I thought of something like ASCII art: a picture of a view would be mapped into a 2D or 3D model of the world based on symbols which can be moved around. Then I had the idea that there might be game engines useful as a base for that. I found these:
>PyPlayScii is a Python package that enables a simple object-oriented implementation of ascii art games. By assigning the shapes of the game objects by texts separated by newline characters and determining what to do every frame, you can quickly implement an ascii art game which can be run directly in a terminal window.
https://pypi.org/project/pyplayscii/
An alternative in Scala, and probably more used and better supported, would be the CosPlay engine: https://cosplayengine.com
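To make the symbol-world idea concrete, here's a toy sketch in plain Python (not PyPlayScii, whose API I haven't tried); objects are just characters on a 2D grid, so "reasoning" about positions reduces to reading and rewriting cells:

# Toy symbolic 2D world: characters stand in for objects on a grid.
GRID = [list(row) for row in [
    "##########",
    "#........#",
    "#..C...T.#",   # C = cup, T = table (arbitrary symbols)
    "#........#",
    "##########",
]]

def find(symbol):
    # Return the (x, y) grid position of the first matching symbol.
    for y, row in enumerate(GRID):
        for x, ch in enumerate(row):
            if ch == symbol:
                return x, y
    return None

def move(symbol, dx, dy):
    # Move a symbol one step if the target cell is empty floor.
    x, y = find(symbol)
    if GRID[y + dy][x + dx] == ".":
        GRID[y][x], GRID[y + dy][x + dx] = ".", symbol

move("C", 1, 0)  # slide the cup one cell toward the table
print("\n".join("".join(row) for row in GRID))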
>>32210 This is a very-cool idea, Anon. Also, thanks for the links. Cheers. :^)
Okay, I deleted the other thread, you're welcome. From what I've seen online, the industry uses either Gazebo or NVIDIA Isaac for simulations, but mostly Gazebo. Honorable mentions are URoboViz for Unreal, the Unity MuJoCo integration, and CoppeliaSim.
>>34895 Why not repost the information again here in the better thread, Anon?
>Isaac
NVIDIA is literally billions into their robo-platforms now. Accelerate! https://developer.nvidia.com/isaac
Open file (4.59 KB 225x225 download (30).jpeg)
>>34932 Okay, Isaac sounds good. Can you all do me a favor? Can we collaborate? Listen, I'm going to do this thing by myself or with someone else, but I'm aware that what I do alone will be more limited in scope. Without a simulation you cannot train AI; without a simulation there can be no collaboration between different people without buying the components. This is where it should have started from the get-go. If we wish to collaborate, there needs to be a decision on what URDF model will be used, and that's a lot of work; I don't see any willingness from most people here to do it or to learn how to do it. I will skip the simulation and do the small-scale robot since it'd be less work for me, I think. Unless there's actual collaboration.
>>35117 I'm sure Anons would be willing to collaborate with you if A) they have the time, energy, and money to (I currently am lacking in two of these), and B) they see you already leading the way with an innovation. Your choice of a kot pic is ironically apropos: telling anons what to/not-to do is like herding literal cats.
>tl;dr
You can't lead Anons 'from behind', but only from the front. And you certainly aren't going to succeed here in that endeavor by bitching-and-moaning/browbeating like some kind of 'moid bro! :DD
>ttl;dr
Do something great, peteblank. I'm quite confident others will follow. Why not follow closely in one of the already-progressing projects like dear Kibo-chan, or with HannahDev's work? If you create something like that and share your progress in the R&D threads, I'm sure many Anons here would become intredasted in your doings. Good luck, Anon.
Open file (2.32 MB 3072x4080 IMG_20241226_230800.jpg)
>>35118 If people aren't willing to pay with (hard) labor, then this is not an open-source project; this is show and tell. Pic related is my fourth attempt at the ferrite core. Needless to say, I might go back to it, but it'll have to wait. I'll make and show a small-scale robot that can walk in late January. I won't explain how it works, however.
>>35141 I've stepped on those before. I don't get why you're wasting time making these when they cost nothing to buy prefab. I always wondered what those cylinder things on cables were; turns out they're just ferrite cores. Why they're there, who knows; the explanation sounds like a superstition to me.
>>35141
>I won't explain how it works
You don't need to explain how your robot works. I can explain how it works if anyone cares to know. As for your magnet issues, consider the E ferrite core. Cheap and good for magnetic attraction if used properly.
>>35142
>I don't know why they're there
They keep voltages stable by saturating the ferrite to resist sudden changes in the electromagnetic fields of the wires. Essentially, if one device produces unsafe voltage ripples for a moment, the ferrite protects the other device. They are also good for filtering out electromagnetic interference, which matters if you're near a microwave or router, though modern woven shielding is frankly better most of the time.
>Seems like superstition
Given the stability of modern electronics, it mostly is. That being said, lower-end cheap electronics can skip the passives which prevent these transient spikes from being a problem. Either way, ferrites are cheap as crisps. Better to have 'em to keep your devices safe; they're also good for audio equipment and for protecting data transfer around any kind of inductive device like motors.
>>35144 I already got E ferrite cores; they keep the magnetism contained. I could try splitting them in half, I guess. The only seller I could find selling pure iron rods is one seller on Amazon, for like $17, and I'd have to pay shipping to Thailand on top; no thanks. I already got the materials anyway. I just need to make a vertical two-piece mold with a lid and cap instead of a horizontal two-piece mold.
>I can explain how it works
And Grommet is going to make the waifubot split wood with an axe.
>>35146 Yes, I want my waifu to split wood, clean floors, and make things nice for me.
>>35160 Might be easier teaching the robit how to operate a hydraulic wood splitter.
>>35161 This, but it'd sure be nice to have a waifu that could handle a maul for you in a pinch. :D
>>35117 Right now, I'm not even sure how to do a full simulation of what I would want to build. I hope there will just be a general humanoid model that I can download and approximate mine to. I don't believe in humanoid robots doing their movements through precise planning; it will be a guesstimate, then watching the sensors while moving. The current Teslabot also seems to use a form of very fast neural network: https://youtu.be/xxoLCQTN0KA
>>35141 This post and all the following ones until my post are OT in this thread.
Open file (523.86 KB 1920x1080 Screenshot (73).png)
>>35179 I'm working on the CAD for the body. It's important to me that it has an adjustable spine, because that will play a big part in making the robot flexible. I will share the CAD, but I don't want it to be public; I want to share it with people who will be working on this. Please invite me on GitHub: https://github.com/peteblank
The choice of her body being thicc is not only preference, but also because it needs to house the components. The weight distribution will be selected based on the center of gravity, which should be in the middle (I think). I have a theory that it ought to have a decent chest weight to act as a counterweight for the spine pulling (think of the stability of pulling a bag of cement vs pulling a wrench). I'm planning on making the test robot walk near the end of January; it'll have empty boxes on the chest to add and remove weights to test the balance.
Since it's been decided the simulation should be on Isaac (I'd prefer Gazebo because it's the industry standard and you can get a job with it): can't find any Gazebo/NVIDIA Isaac tutorials atm.
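On the counterweight/center-of-gravity point, it's quick to sanity-check where the center of mass lands before printing anything. A toy sketch; the segment masses and heights below are placeholder guesses, not values from the CAD:

# Toy center-of-mass check for a stack of body segments.
# Masses (kg) and heights above the ground (m) are placeholder guesses.
segments = [
    ("chest",  4.0, 1.20),
    ("spine",  1.5, 0.95),
    ("pelvis", 3.0, 0.80),
    ("legs",   6.0, 0.40),
]

total_mass = sum(m for _, m, _ in segments)
com_height = sum(m * h for _, m, h in segments) / total_mass
print(f"total mass: {total_mass:.1f} kg, CoM height: {com_height:.3f} m")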
>>35202 No, it seems Isaac is the industry standard now. OK then.
