/robowaifu/ - DIY Robot Wives

Advancing robotics to a point where anime catgrill meidos in tiny miniskirts are a reality.

The canary has FINALLY been updated. -robi

Server software upgrades done, should hopefully keep the feds away. -robi

LynxChan 2.8 update this weekend. I will update all the extensions in the relevant repos as well.

The mail server for Alogs was down for the past few months. If you want to reach out, you can now use admin at this domain.


ROBOWAIFU U Robowaifu Technician 09/15/2019 (Sun) 05:52:02 No.235
In this thread post links to books, videos, MOOCs, tutorials, forums, and general learning resources about creating robots (particularly humanoid robots), writing AI or other robotics related software, design or art software, electronics, makerspace training stuff or just about anything that's specifically an educational resource and also useful for anons learning how to build their own robowaifus. >tl;dr ITT we mek /robowaifu/ school.
Edited last time by Chobitsu on 05/11/2020 (Mon) 21:31:04.
Open file (229.05 KB 500x572 LeonardoDrawing.jpg)
'''Can Programming Be Liberated from the von Neumann Style? A Functional Style and Its Algebra of Programs''' According to Alexander Stepanov (in the foreword to The Boost Graph Library, 2001), John Backus and this Turing Award lecture paper were inspirational to the design of the STL for C++. The STL underpins the current state of the art in generic programming, which must be highly expressive and composable yet also perform very fast. Therefore, indirectly, so does John Backus's FP system, and for that we can be grateful.
Open file (24.66 KB 192x358 Backus.jpg)
>>5191 Backus also invented FORTRAN (back when that was a first of its kind for programming portability), and is one of the smartest men in the entire history of computing. https://ethw.org/John_Backus
>>5173 Started watching this, it's pretty good.
Open file (650.07 KB 1030x720 thetimehathcome.mp4)
Anyone know some good tutorials for getting started in Godot with 3D? I just wanna have materials and animations load correctly and load a map with some basic collision checking to run around inside.
Open file (70.70 KB 895x331 b2_its_time.png)
>>5379 I haven't looked into Godot yet myself, but I'd like to at some point. Please let us know if you locate something good Anon. >that vid leld.
Open file (1.98 MB 1280x720 how2b.mp4)
>>5380 It's painful, but if I find anything good I'll post it here. Blender videos are 'how to do X in 2 minutes', but Godot videos are 30 minutes of mechanical keyboard ASMR and explaining that you should watch the previous video to understand everything, while they're high on helium. There seem to be errors with importing animations of FBX models in 3.2.2, but GLTF works mostly okay. I don't think it's too much of an issue because the mesh of this model has been absolutely destroyed by my naive tinkering and lazy weight painting. FBX corrupted my project somehow, but when I start fresh with GLTF all the textures load and everything works great. I accidentally merged all the vertices by distance and destroyed her face, but if anyone wants to play around with the 2B model I used, here you go: https://files.catbox.moe/1wamgg.glb Taken from a model I couldn't get to import into Blender correctly: https://sketchfab.com/3d-models/nierautomata-2b-cec89dce88cf4b3082c73c07ab5613e7 I'll fix it up another time, or maybe find another model that's ready for animation.
Open file (24.65 KB 944x333 godot3_logo.png)
Found a great site for Godot tutorials and a channel that goes along with it. Text: https://kidscancode.org/godot_recipes/g101/ Videos: https://www.youtube.com/c/KidscancodeOrg/playlists
Understanding Variational Autoencoders (VAEs) >Introduction >In the last few years, deep learning based generative models have gained more and more interest due to (and implying) some amazing improvements in the field. Relying on huge amounts of data, well-designed network architectures and smart training techniques, deep generative models have shown an incredible ability to produce highly realistic pieces of content of various kinds, such as images, texts and sounds. Among these deep generative models, two major families stand out and deserve special attention: Generative Adversarial Networks (GANs) and Variational Autoencoders (VAEs). https://towardsdatascience.com/understanding-variational-autoencoders-vaes-f70510919f73?gi=8b456161e353
Tutorial on Variational Autoencoders >In just three years, Variational Autoencoders (VAEs) have emerged as one of the most popular approaches to unsupervised learning of complicated distributions. VAEs are appealing because they are built on top of standard function approximators (neural networks), and can be trained with stochastic gradient descent. VAEs have already shown promise in generating many kinds of complicated data, including handwritten digits, faces, house numbers, CIFAR images, physical models of scenes, segmentation, and predicting the future from static images. This tutorial introduces the intuitions behind VAEs, explains the mathematics behind them, and describes some empirical behavior. No prior knowledge of variational Bayesian methods is assumed.
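To make the pieces concrete, here's a minimal VAE sketch in PyTorch. This is my own toy illustration (not code from either article): an encoder that outputs a mean and log-variance, the reparameterization trick that keeps sampling differentiable, and the usual reconstruction-plus-KL loss. All the names and sizes are made up for the example.

```python
import torch
import torch.nn as nn

class TinyVAE(nn.Module):
    def __init__(self, in_dim=784, hidden=256, latent=16):
        super().__init__()
        self.enc = nn.Sequential(nn.Linear(in_dim, hidden), nn.ReLU())
        self.mu = nn.Linear(hidden, latent)      # mean of q(z|x)
        self.logvar = nn.Linear(hidden, latent)  # log-variance of q(z|x)
        self.dec = nn.Sequential(nn.Linear(latent, hidden), nn.ReLU(),
                                 nn.Linear(hidden, in_dim))

    def forward(self, x):
        h = self.enc(x)
        mu, logvar = self.mu(h), self.logvar(h)
        # Reparameterization trick: z = mu + sigma * eps, so the sample
        # stays differentiable with respect to mu and logvar.
        z = mu + torch.exp(0.5 * logvar) * torch.randn_like(mu)
        return self.dec(z), mu, logvar

def vae_loss(recon, x, mu, logvar):
    # Reconstruction error plus KL divergence to the unit-Gaussian prior.
    recon_err = nn.functional.mse_loss(recon, x, reduction='sum')
    kl = -0.5 * torch.sum(1 + logvar - mu.pow(2) - logvar.exp())
    return recon_err + kl
```

Train it like any other network: sample a batch, compute `vae_loss`, backprop. The KL term is what pushes the latent space toward a smooth Gaussian you can sample from.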
Open file (387.54 KB 1999x1151 transformer.png)
Great video on programming transformers from scratch in PyTorch: https://www.youtube.com/watch?v=U0s0f995w14 You'll need to know a bit about how they work first: https://www.youtube.com/watch?v=TQQlZhbC5ps And another video if you want to understand the details of the design: https://www.youtube.com/watch?v=rBCqOTEfxvg And if you want a full lecture explaining how information flows through the network: https://www.youtube.com/watch?v=OyFJWRnt_AY Original paper for reference: https://arxiv.org/abs/1706.03762 If you're new to machine learning it might seem impossible to learn because there are a lot of pieces to grasp, but each one is relatively simple. If you study and figure out one new piece a day, eventually you'll understand how the whole thing works and have a good foundation for learning other architectures.
>related crosspost >>9065
Open file (239.89 KB 572x870 ROS for Beginners.png)
A very detailed book that explains:
1.) How to install Ubuntu Linux
2.) How to install the Robot Operating System on Ubuntu
3.) How to begin programming your robot using C++
Ideal for us here since I know a lot of people use Raspbian or at least some Linux distro to operate their robots, and I think the two are very similar. (Although if you have Windows 10, you can also install ROS on that, too.)
>>9081 Thanks! Sounds like a very useful book. I tried to install ROS once before, but I ran into numerous dependency challenges. That was a few years ago (maybe 3?), so maybe it's easier now. This book sounds like it makes the process straightforward as well. I look forward to digging into it.
Open file (341.64 KB 894x631 EM.png)
Expectation maximization is an iterative method to find maximum likelihood estimates of parameters in statistical models, where the model depends on unobserved latent variables. https://en.wikipedia.org/wiki/Expectation%E2%80%93maximization_algorithm How EM is useful solving mixture models: https://www.youtube.com/watch?v=REypj2sy_5U How it works: https://www.youtube.com/watch?v=iQoXFmbXRJA Longer lecture on EM algorithms for machine learning: https://www.youtube.com/watch?v=rVfZHWTwXSA EM was applied to hindsight experience replay (which improves expectations of future states from past failures) to greatly improve the learning efficiency and performance, particularly in high-dimensional spaces: https://arxiv.org/abs/2006.07549 Hindsight Experience Replay: https://arxiv.org/abs/1707.01495
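As a toy illustration of the E-step/M-step loop described above, here's EM fitting a two-component 1D Gaussian mixture in NumPy. This is my own sketch (not from the lectures), with the component variances fixed to 1 for brevity so only the means and mixing weights are estimated.

```python
import numpy as np

def em_gmm(x, iters=50):
    # Fit a two-component 1D Gaussian mixture with unit variances.
    mu = np.array([x.min(), x.max()])  # crude but effective initialization
    pi = np.array([0.5, 0.5])          # mixing weights
    for _ in range(iters):
        # E-step: responsibility of each component for each data point,
        # i.e. the posterior over the latent component assignment.
        dens = pi * np.exp(-0.5 * (x[:, None] - mu) ** 2)
        resp = dens / dens.sum(axis=1, keepdims=True)
        # M-step: re-estimate parameters from the soft assignments.
        nk = resp.sum(axis=0)
        mu = (resp * x[:, None]).sum(axis=0) / nk
        pi = nk / len(x)
    return mu, pi
```

The latent variable here is which component generated each point; EM alternates between inferring it softly (E-step) and maximizing the likelihood given those soft assignments (M-step).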
I haven't come across a good article or post on pre-training neural networks, but I think it's a really important subject for anyone doing machine learning. Recently, when trying to convert the T5 model into an autoencoder, I made the mistake of forgetting to pre-train it on autoencoding before converting the hidden state into a variational autoencoder. Because of this the decoder was unable to decode anything useful, since it was getting random input from the untrained VAE, making it extraordinarily difficult to train. After fixing this I also locked the parameters of the T5 encoder and decoder to further improve training efficiency, training the VAE specifically on producing the same hidden state output as its hidden state input so the decoder doesn't become skewed learning how to undo the VAE's inaccuracy. Once the VAE reaches a reasonable accuracy, I will optimize the whole model in tandem while retaining the VAE's consistency loss.

Pre-training is also really important for reinforcement learning. I can't remember the name of the paper right now, but there was an experiment that had an agent navigate a maze and collect items. Finding a reward from a randomly initialized network is nearly impossible, so before throwing the agent into the main task they taught it with auxiliary tasks, such as how to smoothly control looking around the screen and how to predict how the scene changes as it moves around. A similar paper to this was MERLIN (Memory, RL, and Inference Network), in which an agent was taught how to recognize objects, memorize them and control looking around before being thrown into the maze to search for different objects: https://arxiv.org/abs/1803.10760

For learning to happen efficiently a network has to learn tasks in a structured and comprehensive way; otherwise it's like trying to learn calculus before knowing how to multiply or add. The problem has to be broken down into smaller, simpler problems that the network can learn to solve individually before tackling a more complicated one. Not only do the problems have to be broken down, they need to be structured in a hierarchy, so the final task can be solved with as few skills as possible. The issue of pre-training, transfer learning and how to do it properly will become more important as machine learning tackles more and more complicated tasks. The subject could deserve its own thread one day, but for now just being aware of it will make your experiments a lot easier.
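The staged training described above (lock the pre-trained encoder/decoder, train the new bottleneck, then fine-tune everything in tandem) comes down to toggling `requires_grad` in PyTorch. The modules below are hypothetical stand-ins, not the actual T5 classes, just to show the mechanics:

```python
import torch.nn as nn

def set_trainable(module, flag):
    # Freeze (flag=False) or unfreeze (flag=True) every parameter in a module.
    for p in module.parameters():
        p.requires_grad = flag
    return module

# Hypothetical stand-ins for a pre-trained encoder/decoder and a new VAE bottleneck.
encoder = nn.Linear(512, 512)
bottleneck = nn.Linear(512, 512)
decoder = nn.Linear(512, 512)

# Stage 1: lock the pre-trained parts so gradient updates hit only the bottleneck.
set_trainable(encoder, False)
set_trainable(decoder, False)
trainable = [p for p in bottleneck.parameters() if p.requires_grad]
# ...pass `trainable` to the optimizer and train until the bottleneck
# reproduces its input hidden state...

# Stage 2 (once the bottleneck is accurate): unfreeze and fine-tune everything.
set_trainable(encoder, True)
set_trainable(decoder, True)
```

Frozen parameters receive no gradients, so the optimizer leaves the pre-trained weights untouched while the new component catches up.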
Open file (96.49 KB 356x305 roboticist4.jpg)
Open file (104.97 KB 716x199 roboticist1.jpg)
Open file (359.55 KB 500x601 roboticist9_0.jpg)
>Mark Tilden on “What is the best way to get a robotics education today?” https://robohub.org/mark-tilden-on-what-is-the-best-way-to-get-a-robotics-education-today/
Synthesis of asynchronous circuits >Abstract >The majority of integrated circuits today are synchronous: every part of the chip times its operation with reference to a single global clock. As circuits become larger and faster, it becomes progressively more difficult to coordinate all actions of the chip to the clock. Asynchronous circuits do not suffer from this problem, because they do not require global synchronization; they also offer other benefits, such as modularity, lower power and automatic adaptation to physical conditions. >The main disadvantage of asynchronous circuits is that techniques for their design are less well understood than for synchronous circuits, and there are few tools to help with the design process. This dissertation proposes an approach to the design of asynchronous modules, and a new synthesis tool which combines a number of novel ideas with existing methods for finite state machine synthesis. Connections between modules are assumed to have unbounded finite delays on all wires, but fundamental mode is used inside modules, rather than the pessimistic speed-independent or quasi-delay-insensitive models. Accurate technology-specific verification is performed to check that circuits work correctly. >Circuits are described using a language based upon the Signal Transition Graph, which is a well-known method for specifying asynchronous circuits. Concurrency reduction techniques are used to produce a large number of circuits that conform to a given specification. Circuits are verified using a bi-bounded simulation algorithm, and then performance estimations are obtained by a gate-level simulator utilising a new estimation of waveform slopes. Circuits can be ranked in terms of high speed, low power dissipation or small size, and then the best circuit for a particular task chosen. >Results are presented that show significant improvements over most circuits produced by other synthesis tools. 
>Some circuits are twice as fast and dissipate half the power of equivalent speed-independent circuits. Examples of the specification language are provided which show that it is easier to use than current specification approaches. The price that must be paid for the improved performance is decreased reliability, technology dependence of the circuits produced, and increased runtime compared to other tools.
Graph Algorithms: Practical Examples in Apache Spark and Neo4j
>Whether you are trying to build dynamic network models or forecast real-world behavior, this book demonstrates how graph algorithms deliver value: from finding vulnerabilities and bottlenecks to detecting communities and improving machine learning predictions. We walk you through hands-on examples of how to use graph algorithms in Apache Spark and Neo4j, with sample code and tips for over 20 practical graph algorithms that cover importance through centrality, community detection and optimal pathfinding.
>Read this book to:
>- Learn how graph analytics vary from conventional statistical analysis
>- Understand how classic graph algorithms work and how they are applied
>- Dive into popular algorithms like PageRank, Label Propagation and Louvain to find out how subtle parameters impact results
>- Get guidance on which algorithms to use for different types of questions
>- Explore algorithm examples with working code and sample datasets for both Apache Spark and Neo4j
>- See how connected feature extraction increases machine learning accuracy and precision
>- Walk through creating an ML workflow for link prediction combining Neo4j and Apache Spark
https://neo4j.com/graph-algorithms-book
(Upload is too large)
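For a taste of what's inside: PageRank, one of the algorithms the book covers, reduces to a short power iteration. This NumPy sketch is my own illustration (not the book's Spark/Neo4j code); `adj[i][j] = 1` means node j links to node i, and the damping factor 0.85 is the conventional choice.

```python
import numpy as np

def pagerank(adj, damping=0.85, iters=100):
    # Power-iteration PageRank on a dense adjacency matrix.
    n = adj.shape[0]
    out_deg = adj.sum(axis=0)
    out_deg[out_deg == 0] = 1       # avoid dividing by zero for sink nodes
    M = adj / out_deg               # column-stochastic transition matrix
    rank = np.full(n, 1.0 / n)      # start from a uniform distribution
    for _ in range(iters):
        # Random surfer: teleport with probability (1 - damping),
        # otherwise follow an outgoing link.
        rank = (1 - damping) / n + damping * M @ rank
    return rank
```

For a 3-node cycle (0→1→2→0) every node ends up with rank 1/3, as symmetry demands; real use would run this on a sparse matrix instead of a dense one.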
>>10398 >(Upload is too large) No, the AlogSpace site has a 20MB upload limit IIRC Anon. Fits comfortably within that.
>>10405 My download was 23MB; if it's the same book, maybe they altered something.
>>10413 Just comparing the two yourself would be my suggestion.
Finally, I have at least a search term to find data on the size of human limbs: anthropometric reference data. A lot to read though: https://www.sciencedirect.com/topics/engineering/anthropometric-data It still bothers me that I can't just look up the weight of limbs or the length of bones and such somewhere.
Open file (110.29 KB 318x493 powerdevelopment.png)
>>10542 There are some decent digital human body encyclopedias you can find on the usual pirating sites out there. I was more into the in-depth strength training materials when looking at building a humanoid robot. This sort of valuable information can't be found in reference data.
>>10548 Good thinking Anon. Not him, but I've worked through Convict Conditioning, and while I was doing it I thought often about the fact that the very same issues of dynamics & strength were actually going to be important design & engineering concerns for us as we build robowaifus. That opinion hasn't changed in the slightest now that I'm actually looking harder at the skeletal designs for her, etc.
Latest Release Completes the Free Distribution of A Knowledge Representation Practionary: https://www.mkbergman.com/2461/entire-akrp-book-now-freely-available/ >A Knowledge Representation Practionary is a major work on knowledge representation based on the insights of Charles S. Peirce, shown at age 20 in 1859, who was the 19th century founder of American pragmatism, and also a logician, scientist, mathematician, and philosopher of the first rank. The book follows Peirce’s practical guidelines and universal categories in a structured approach to knowledge representation that captures differences in events, entities, relations, attributes, types, and concepts. Besides the ability to capture meaning and context, the Peircean approach is also well-suited to machine learning and knowledge-based artificial intelligence.
>>10634 Wow, sounds like a remarkable work Anon. Look forward to reading.
>'''"Mathematics for Machine Learning - Why to Learn & What are the Best Free Resources?"'''
Talented and very smart technical animator. Science & Technology topics. https://www.youtube.com/c/ThomasSchwenke-knowledge/playlists
>>4660 >related crosspost (>>11211)
DeepMind YT playlists https://www.youtube.com/c/DeepMind/playlists This anon recommended it (>>11555). I'm currently working through the 8-video Deep Learning Introduction list.
Computer Systems: A Programmer's Perspective, 3/E (CS:APP3e) Randal E. Bryant and David R. O'Hallaron, Carnegie Mellon University
>Memory Systems
>"Computer architecture courses spend considerable time describing the nuances of designing a high performance memory system. They discuss such choices as write through vs. write back, direct mapped vs. set associative, cache sizing, indexing, etc. The presentation assumes that the designer has no control over the programs that are run and so the only choice is to try to match the memory system to the needs of a set of benchmark programs.
>"For most people, the situation is just the opposite. Programmers have no control over their machine's memory organization, but they can rewrite their programs to greatly improve performance. Consider the following two functions to copy a 2048 x 2048 integer array:

void copyij(long int src[2048][2048], long int dst[2048][2048])
{
    long int i, j;
    for (i = 0; i < 2048; i++)
        for (j = 0; j < 2048; j++)
            dst[i][j] = src[i][j];
}

void copyji(long int src[2048][2048], long int dst[2048][2048])
{
    long int i, j;
    for (j = 0; j < 2048; j++)
        for (i = 0; i < 2048; i++)
            dst[i][j] = src[i][j];
}

>"These programs have identical behavior. They differ only in the order in which the loops are nested. When run on a 2.0 GHz Intel Core i7 Haswell processor, copyij runs in 4.3 milliseconds, whereas copyji requires 81.8 ms, more than 19 times slower! Due to the ordering of memory accesses, copyij makes much better use of the cache memory system.
http://csapp.cs.cmu.edu/3e/perspective.html
>===
-minor fmt patch
Edited last time by Chobitsu on 08/26/2021 (Thu) 17:10:51.
The Elements of Computing Systems, second edition: Building a Modern Computer from First Principles Noam Nisan, Shimon Schocken >I Hardware >"The true voyage of discovery consists not of going to new places, but of having a new pair of eyes." >t. Marcel Proust (1871–1922) >This book is a voyage of discovery. You are about to learn three things: how computer systems work, how to break complex problems into manageable modules, and how to build large-scale hardware and software systems. This will be a hands-on journey, as you create a complete and working computer system from the ground up. The lessons you will learn, which are far more important than the computer itself, will be gained as side effects of these constructions. According to the psychologist Carl Rogers, “The only kind of learning which significantly influences behavior is self-discovered or self-appropriated—truth that has been assimilated in experience.” This introduction chapter sketches some of the discoveries, truths, and experiences that lie ahead.
>(>>15925 related crosslink, OS design)
>just dropping this here for us, since neither seem to be present ITT yet? https://functionalcs.github.io/curriculum/ https://teachyourselfcs.com/ https://www.mooc.fi/en/ (pozz-warning, but much good stuff as well)
>(>>16124, related crosspost)
Open file (33.22 KB 342x443 1651114786655-0.jpg)
So I think the time has finally come at last. I hope for the fortitude of soul to finally pursue maths. As a repeat High School dropout they kept putting me back in anyway lol, I absolutely loathed (and do even more so today) the modern public """education""" systems. So I felt at the time my decisions were well-merited. Heh. Fast-forward to today, and I barely know how to add 2+2 haha. :^) Missing out on the basics of algebra, trig, geometry, pre-calc, calc, stats, etc., is proving a big hindrance to my progress on robowaifus today. Even though I'm now an adult man, I think I can still pick it up. The challenge is making the time to study on top of my already-overflowing plate, and the AFK pressures of keeping body & soul connected together. >tl;dr I'm starting with Euler's, wish me luck Anons! > >P.S. Feel free to pester me once a year -- say, during Summers -- to know how this little project is going. Sharp pointy-sticks can be a good thing after all. :^)
Open file (284.88 KB 1440x1080 MILK.jpg)
>>16302 I know that feel. I also dropped out and ended up math illiterate. I feel like the only useful subjects in school were Phys. Ed. and Math. Even the ones that sound useful on paper like Philosophy weren't useful, at least as far as what school taught of them goes. A system that wastes entire childhoods has got to be the most evil idea ever.
>>16302 I know it hasn't been a year yet, but how's it going? >Elements of Algebra If you can start without knowing the basics of algebra and end up understanding that book, then you have some serious potential. A lot of higher mathematics mostly requires a ton of tenacity and creativity trying to understand what people much further ahead than you are trying to explain. Usually our education system doesn't force students to go through that unless they've chosen to get a degree in mathematics. Elements of Algebra looks like it was written in the same tradition. It reads like it's meant for highly educated and highly dedicated people, just ones that haven't studied much math. If you get stuck, don't feel like it's cheating to look up other material or ask for help. The goal is not to get through the material on your own. The goal is to develop the right intuition so that the language of math becomes second nature, and all is fair in that pursuit. Do bang your head against some problems until you figure them out, because it's good to prevent yourself from getting lazy about exercising your brain, but don't do it for every single problem you encounter, since that will slow you down too much.
Open file (37.81 KB 346x202 misc.png)
How to Get Started with Animatronics – Thought Process, Workflow, Resources and Skills https://www.youtube.com/watch?v=8VzQshnrvN0 Will Cogley is knowledgeable, with a legit ME degree. He doesn't sound like a faggot, and therefore isn't highly off-putting to listen to. He's creative and skilled, and he's accomplished in areas pertinent to robowaifu development. I'd recommend you subscribe to his channel.
>>18698 Good idea, but he isn't new to the board. We followed his videos one or two years ago; I think he stopped at some point. One big problem is keeping in mind which video or channel covered which topic. I think his mouth mechanism was interesting, and how he approached speaking syllables and such. Also, his work on hands was interesting. However, the problem with common animatronics is that they are not waterproof, especially in regards to the eye mechanism. Still a good way to get a grip on what challenges we have to go through.
>>16302 >>17436 If anyone wants a math tutor, I could fill the role. I have an engineering degree and one of my minors is in math. Besides, tutoring will help me keep my math skills sharp. @ribozyme:matrix.org is the best way to reach me. I might make a group depending on how many people are interested.
If anyone has some resources on material science that would be greatly appreciated.
>>18705 Thank you Ribose. I may take you up on it, since preserving my inanities over learning maths is hardly important to our community here. Please allow me some time to sort out my current work/school schedules.
>>18705 Thanks again Ribose. So what would be a good study guide for a very smart adult, but one who has 'formal' maths training only through about 9th grade, say pre-Algebra? OTOH, I've already written high-performance software that does sophisticated integrations in 2D & 3D space in professional studio settings. My 3D visualization and imagination skills are generally quite strong, but I have basically little-to-no technical training beyond self-taught efforts (apart from animation). So yeah, I'm a bit of a weird mix for maths, Ribose! :^) >=== -minor prose edit
Edited last time by Chobitsu on 01/29/2023 (Sun) 15:05:35.
>>19263 I recommend Khan Academy. https://www.khanacademy.org/math/algebra You can also use WolframAlpha to work problems for you if you get stuck; the website shows the work and everything.
>>19317 Thanks Anon. I'd looked into it in the past; maybe it's time to look into it again. Cheers.
