/robowaifu/ - DIY Robot Wives

Advancing robotics to a point where anime catgrill meidos in tiny miniskirts are a reality!

alogs.space e-mail has been restored. You may now reach me using "admin" at this domain. -r

We are back. - TOR has been restored.

Canary update coming soon.


“Look at a stone cutter hammering away at his rock, perhaps a hundred times without as much as a crack showing in it. Yet at the hundred-and-first blow it will split in two, and I know it was not the last blow that did it, but all that had gone before.” -t. Jacob A. Riis


Open file (51.12 KB 256x256 ClipboardImage.png)
Lemon Cookie EnvelopingTwilight##AgvQjr 04/28/2025 (Mon) 21:51:57 No.37980 [Reply]
The original thread can be found here: https://trashchan.xyz/robowaifu/thread/595.html

Welcome to the Lemon Cookie thread. The goal of Lemon Cookie is to create a framework where a synthetic "mind and soul" can emerge through an "LLM as cognitive architecture" approach. This thread exists to collect feedback, ask for help & to document my progress.

First I am going to try to give a high-level overview of how this cognitive architecture is envisioned and the ideas behind it. I have spent time looking at cognitive architecture work; in the field there is now a consensus on how the mind works at a high level. An important mechanism is a "whiteboard": basically a global temporary memory that all the other systems read in and out of. Then there are different long-term memory systems that react to and add content to the whiteboard, along with pattern matcher(s)/rules that work on the content of the whiteboard. A key thing to consider is the difference in philosophy that cognitive architecture projects have: the intelligence is considered to emerge from the entire system. Compare this to LLM agent work, where the intelligence is considered to be the LLM.

My feelings on the general LLM space are conflicted; I am both amazed and really disappointed. LLMs possess an incredible level of flexibility, world knowledge and coherence, but everything outside of the model is stagnant. It's endless API wrappers & redundant frameworks, all slight permutations on RAG & basic tool calling. I believe that LLMs are misused as chatbots; simply put, their pattern-matching and associative power is constrained by the chat format and shallow tooling.

Here are the important aspects of the Lemon Cookie Cognitive Architecture so far:
1. Memory is difficult. I do not think there is a singular data structure or method that can handle it all; several distinct types of memory will be needed. So far I plan for a PathRAG-like system and a "Triadic Memory"-inspired system for external associations (this is missing in most LLM solutions).
2. LLM as kernel. The LLM's context window is the whiteboard and has a REPL-like mechanism. It holds structured data and logic in a scripting-like format, so it's both LLM- & human-readable while staying easy to parse & allowing for expressive structured data. The LLM's role will be to decompose data and make patterns and associations explicit as executable statements.
3. The language has to be LLM/CogArch-centric. There are a thousand ""agents"" that give LLMs a Python interpreter as a tool; the two need to be more tightly coupled. Scripted behavior works via pattern matching: the whiteboard is a bag of objects, which allows for programmable pattern matching (think functional programming, like Haskell). It's also important to allow the LLM to observe code execution and to be able to modify state and execution flow. Data in languages have scoping rules; so should LLM context. Etc... I will go into more depth about the language in another post.
4. Another important system is the "GSR" (Generative Sparse Representation), and it will be a first-class language & runtime type. This also needs its own post, but in general I am inspired by two things: the "Generative FrameNet" paper, where an LLM & an embedding model are used to automatically construct new FrameNet frames, and Numenta's SDRs/"Sparse distributed memory". This representation has a lot of useful properties for memory (please watch the videos under the "What the hell is an SDR?" segment in my links list for an easy introduction). I think SDR unions & SDR noise tolerance will be especially useful.
5. A custom model. For all of the above to work well, a model will need to be fine-tuned with special behaviors. I do want input on this. Baking facts & behaviors into LLM weights is costly, creating bloated models that are hard to run or train (why memorize all the capitals?), while letting authors gatekeep truth and impose "safety" detached from context. Blocking role-play "violence" or intimacy isn't protection: it's authors hijacking your AI companion to preach at you. Externalizing behaviors via whiteboard pattern matching shifts control: stabbing you in-game can be funny, but a robot wielding a knife isn't. Maybe you want intimacy privately, but don't want your AI flirting back at your friends.

When put together, I think this will be able to host a kind of synthetic "soul". In a living being, what we call a personality is the accumulated associations, learned behaviors, beliefs and quirks molded by a unique set of experiences. I hope this will be true for this system too.
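A minimal sketch of the SDR properties called out in point 4 (not the actual GSR design; just Numenta-style SDRs as Python sets of active bit indices, to show why unions & noise tolerance are useful):

import random

SDR_SIZE = 2048      # total bit positions
ACTIVE_BITS = 40     # ~2% sparsity, typical for Numenta-style SDRs

def random_sdr():
    """An SDR represented as the set of its active bit indices."""
    return set(random.sample(range(SDR_SIZE), ACTIVE_BITS))

def add_noise(sdr, flips):
    """Corrupt an SDR by dropping `flips` active bits and adding random ones."""
    noisy = set(random.sample(sorted(sdr), ACTIVE_BITS - flips))
    while len(noisy) < ACTIVE_BITS:
        noisy.add(random.randrange(SDR_SIZE))
    return noisy

a, b = random_sdr(), random_sdr()
union = a | b                 # one set can hold several patterns at once
noisy_a = add_noise(a, 10)    # 25% of a's bits corrupted

# Matching is just intersection size (overlap score):
print(len(noisy_a & a))       # ~30: still clearly recognized despite noise
print(len(noisy_a & b))       # ~0-1: chance-level overlap with an unrelated SDR
print(len(a & union), len(b & union))  # 40 40: both members match the union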

Open file (1.53 MB 959x1280 ClipboardImage.png)
First non-intro post.
<What am I currently up to at this moment?
Currently writing a tokenizer and parser to begin implementing a scripting language. I'm trying to keep it KISS: its runtime is going to be a simple tree-walking interpreter, and for the GC I will be piggybacking on the D runtime GC. The goal is to have a simple base to experiment on for figuring out language specifics. For the base I am following the basic structure of the interpreter from the "Writing An Interpreter In Go" book, but this time in D, and of course I am making changes for my end goal instead of doing it just for learning (it's a good book; I recommend it). Here is a gist of it: https://gist.github.com/flashburns/b946e4d530f3f20d461a6ff90d6f86cc
<Why a custom language? (Going into depth about it)
I have considered taking an existing language like Lua, Tcl, a mini JS or Scheme, but it quickly turns into a hacky bloated mess; the language needs to work well with LLM psychology. Syntax has an impact on LLM performance: for example, if you restrict the output of newlines and punctuation, performance degrades. This is because a lot of LLMs will plan for the next sentence during a newline (1). It's not just Claude; worse LLM perf is noticeable for most models when outputting JSON, likely due to string escapes. Having the LLM drop a few IQ points during "tool calls" because we insist on a format that does not handle multi-lines well is silly. In theory a model could be trained on more JSON to mitigate this (I'm sure the big labs do), but I'm GPU poor, so changing the format to play into LLM strengths seems more pragmatic to me. :^)
I want this system to be deterministic and for the program state to be fully serializable. The plan is to build this as a functional-style language using the actor model; impure operations like IO, API calls & non-deterministic LLM calls will be handled by special actors that record their messages for playback determinism. Symbols (stuff like functions & vars) and object members will be resolved via semantic search (like embedding vector similarity) and via tags instead of by names; there also needs to be searchable runtime meta info & docs for the LLM. You can think of this like IDE smart autocomplete & inline docs, but for the LLM.
Another language construct I want to add is "Flow Justification", where a justification is passed as part of a message, function call or control-flow action (inspired by (3)). It would be a chaining structure that looks like a call stack trace, but it would also include semantic information: what the LLM's intent was when a symbol was resolved via semantic search (and then called), or when the LLM was used in control flow (like an if statement). Code could also have "intent statements", where programmatic code adds a comment, or the LLM generates one, but instead of being ignored by the language it's added to the "Flow Justification" chain. This summary of choices could be really useful for producing compact summaries, which helps the LLMs (especially sub-contexts like the Claude think tool (2)) and debugging. The "Flow Justification" chain could also be used for resolving symbols semantically.
(1) https://transformer-circuits.pub/2025/attribution-graphs/biology.html | On the Biology of a Large Language Model
https://transformer-circuits.pub/2025/attribution-graphs/methods.html | Circuit Tracing: Revealing Computational Graphs in Language Models (related)
(2) https://www.anthropic.com/engineering/claude-think-tool | The "think" tool: Enabling Claude to stop and think in complex tool use situations
(3) https://youtu.be/OrQ9swvm_VA
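The actual lexer is in D (see the gist above); purely as illustration, here is a hypothetical Python sketch of the book-style "scan one token at a time" shape, with made-up token types for the tag & string syntax used later ITT:

from dataclasses import dataclass
from enum import Enum, auto

class TokenType(Enum):
    IDENT = auto()    # wait_for_input, main_loop, ...
    STRING = auto()   # "sounds like insult"
    TAG = auto()      # #on_imply
    ASSIGN = auto()
    EOF = auto()

@dataclass
class Token:
    type: TokenType
    literal: str

def lex(src: str) -> list[Token]:
    tokens, i = [], 0
    while i < len(src):
        c = src[i]
        if c.isspace():
            i += 1
        elif c == '#':                    # tag literal
            j = i + 1
            while j < len(src) and (src[j].isalnum() or src[j] == '_'):
                j += 1
            tokens.append(Token(TokenType.TAG, src[i:j])); i = j
        elif c == '"':                    # string literal (no escapes in sketch)
            j = src.index('"', i + 1)
            tokens.append(Token(TokenType.STRING, src[i + 1:j])); i = j + 1
        elif c == '=':
            tokens.append(Token(TokenType.ASSIGN, '=')); i += 1
        elif c.isalpha() or c == '_':     # identifier
            j = i
            while j < len(src) and (src[j].isalnum() or src[j] == '_'):
                j += 1
            tokens.append(Token(TokenType.IDENT, src[i:j])); i = j
        else:
            i += 1                        # punctuation this sketch ignores
    tokens.append(Token(TokenType.EOF, ''))
    return tokens

print(lex('#on_imply "sounds like insult"'))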

Open file (1.35 MB 853x1280 ClipboardImage.png)
>>37981 "describe the on_imply tag." @ tag on_imply; // bind function to a regular var with a regular symbol lookup. main_loop = function void() { var msg = wait_for_input(); // Natural language input: "I'm gonna rip your owners head off." var implication = call_llm(`What does the message imply? {msg}`); $(#on_imply, implication)(msg); }; // Bind function to semantic symbol only with the on_imply tag. #on_imply "sounds like insult" @ function void(string msg) { say(call_llm("Come up with a cool insult back. Msg: {msg}")); } #on_imply "implies physical danger" @ function void(string msg) { engage_target(); // calling by symbol.

>>37981
Since you are making a new language, why not use one that, I think, already has a lot of the attributes you want: Rebol. It is a bit in flux, but the original is still functional and runs on all sorts of stuff. It's a bit functional but can be molded, and it was voted one of the most expressive languages. A KEY strength is that you can write new languages in it; they call them dialects. It has a system called "parse" that helps with this. Short quote:
"... One of the greatest features of the Rebol language has always been its parsing engine, simply called Parse. It is an amazing piece of design from Carl Sassenrath, that spared all Rebol users for the last 15 years, from the pain of having to use the famously unmaintainable regexps. Now, Parse is also available to Red users, in an enhanced version! So, in short, what is Parse? It is an embedded DSL (we call them "dialects" in the Rebol world) for parsing input series using grammar rules. The Parse dialect is an enhanced member of the TDPL family. Parse's common usages are for checking, validating, extracting, modifying input data or even implementing embedded and external DSLs..."
https://web.archive.org/web/20140210092139/http://blog.hostilefork.com/why-rebol-red-parse-cool
Some reviews/big ideas in Rebol:
"...REBOL was designed to make it easier to communicate between computers, or between people and computers, using context-dependent sublanguages...."
https://www.drdobbs.com/embedded-systems/the-rebol-scripting-language/184404172
http://blog.hostilefork.com/arity-of-rebol-red-functions/
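Not Rebol itself (so as not to misquote its syntax), but to make "grammar rules as plain data" concrete, here is a rough Python analogue of the Parse idea; the rule format is invented for illustration:

def match(text, rules, captures, pos=0):
    """Interpret a list of rules (the 'grammar as data') against text."""
    for rule in rules:
        if isinstance(rule, str):              # literal: must appear next
            if not text.startswith(rule, pos):
                return None
            pos += len(rule)
        else:                                  # ('copy', name, charset, count)
            _, name, chars, count = rule
            chunk = text[pos:pos + count]
            if len(chunk) != count or any(ch not in chars for ch in chunk):
                return None
            captures[name] = chunk
            pos += count
    return pos

digits = set("0123456789")
date_rule = [("copy", "y", digits, 4), "-",
             ("copy", "m", digits, 2), "-",
             ("copy", "d", digits, 2)]
caps = {}
assert match("2025-05-03", date_rule, caps) == 10
print(caps)   # {'y': '2025', 'm': '05', 'd': '03'}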

>>37981
I really want to reiterate that Rebol sounds exactly like what you are looking for. Kind of LISPy, but without some of the limitations and parentheses. Clear readability due to spaces and grouping. I'm not some super programmer, but I can see good things and it's not hard to appreciate them. One more link.
http://blog.revolucent.net/2009/07/deep-rebol-bindology.html

Open file (13.11 KB 183x275 images (30).jpeg)
Meat Space Organization Robowaifu Technician 05/03/2025 (Sat) 15:56:58 No.38131 [Reply]
>Why organize IRL?
This community is great and many individuals are already making progress on their prototypes. However, this endeavor is not a one-man job, and organizing IRL would give better opportunities for collaboration and skill sharing/development.
>Where should we organize?
I propose we organize within a seven-hour radius of Chicago. This would include the major cities of Milwaukee WI, St. Louis MO, Detroit MI, Indianapolis IN, Louisville KY, Cleveland OH, Cincinnati OH, Columbus OH, Pittsburgh PA, Nashville TN, and Minneapolis MN. There are plenty of opportunities for jobs and universities in these cities. In particular, Chicago has a booming tech industry (with Google even moving their HQ to the city), and the electronics manufacturing and biotech sectors are pretty good too. Detroit MI has a pretty good robotics industry, although it might not be the best place to live. Another wonderful thing about this area is that most of these homes have basements that could be turned into workshops.
>How would this all be organized?
I propose a fraternity system. A fraternity would help build community and give us a vehicle to recruit others from STEM and the skilled trades (machinists especially). The organization could also host a scholarship to help us build talent. Anons with enough money for a down payment could buy homes in the area and rent at below-market rate to anons moving in. Different homes could specialize in different projects and collaborate and share equipment and skills with others nearby. What do you think /robowaifu/?
31 posts and 7 images omitted.
>>38395 Well, maker spaces started as part of the Bay Area tech optimism of the late 2000s - early 2010s. So they're very Millennial/Obama 1/"lol pizza unicorn robot"/Big Bang Theory era coded. Look at any event or group with "STEM" in their name, and you can get the general vibe. I used to be big into that as a kid, but I slowly grew distant from it. So anything with robowaifus would be politically unpopular. Plus a lot of them are focused on individual expression, which is good and I strongly support that, but it would remove the big draw of an IRL robowaifu group.
>>38398 Seems like those issues could be eliminated if the founding stock came from this place and outside recruits were carefully vetted. The focus of the space would be robowaifus, collaboration, and skill sharing.
Open file (84.50 KB 454x543 1695868530490376.png)
>>38386
It's good to hear that you're financially stable at least. My concern for most anons is that they are having a hard time just staying afloat, so they don't have the resources to dedicate themselves fully.
>>38392
I forgot that these even existed. Looking back into it, I can definitely see the people attending these being up their own ass, but I think that just comes with the territory of tech/progress/disruptor/any other buzzword of the week you want to throw in.
>>38398
Sounds like the complete opposite of board culture; hanging around toxic optimism and inflated egos would be pretty corrosive for me personally.
>>38399
Gatekeeping should not be a frowned-upon concept like it is. The only benefit that I could see from a makerspace is the equipment, and even then the people running the thing would probably treat it like a public library and have cheap stuff that they wouldn't mind the users breaking. Of course it would depend on whatever makerspace you're part of, but I just can't imagine they're letting whomever have free access to 5-axis CNC mills.
>>38401 >Gatekeeping should not be a frowned upon concept like it is, the only benefit that I could see from a makerspace is the equipment, and even then the people running the thing would probably treat it like a public library and have cheap stuff that they wouldn't mind the users breaking. Of course it would depend on whatever makerspace you're part of but I just can't imagine they're letting whomever have free access to 5 axis CNC mills. Gatekeeping was part of the plan. Hence why a fraternal order was chosen as the method of organization. People would definitely need to be trained on the equipment. Especially the more dangerous and expensive stuff.
>>38401
>It's good to hear that you're financially stable at least. My concern for most anons is that they are having a hard time just staying afloat, so they don't have the resources to dedicate themselves fully.
I'm hoping to keep rent low enough that people could work a part-time job and still work on the robowaifus. I'm thinking $400-500 a month, and that would include bills. I don't like the idea of rent, but I need to be able to repair things when they break down, and to weed out NEETs. I can also help people cut costs by doing basic automotive repairs (oil changes, spark plugs, belt replacements, etc.)

Open file (259.83 KB 1024x576 2-9d2706640db78d5f.png)
Single board computers & microcontrollers Robowaifu Technician 09/09/2019 (Mon) 05:06:55 No.16 [Reply] [Last]
Robotic control and data systems can be run by very small and inexpensive computers today. Please post info on SBCs & microcontrollers.
en.wikipedia.org/wiki/Single-board_computer https://archive.is/0gKHz
beagleboard.org/black https://archive.is/VNnAr
>===
-combine 'microcontrollers' into single word
Edited last time by Chobitsu on 06/25/2021 (Fri) 15:57:27.
230 posts and 60 images omitted.
https://www.tomshardware.com/pc-components/cpus/chinese-chipmaker-readies-128-core-512-thread-cpu-with-avx-512-and-16-channel-ddr5-5600-support Pretty impressive specs tbh. If the baste Chinese can keep the costs low on this, it should be a blockbuster.
>>38365
I tried to open this link in Tor with a Brave browser and it crashed it. Twice. I didn't try again.
zeptoforth
A Forth OS for microcontrollers. It looks fairly full-featured.
https://github.com/tabemann/zeptoforth
Chobitsu is all about C & C++, and I'm not knocking that, but Forth's speed is right up there with them. If I understand correctly, most motherboards used to have their startup programming done in Forth because of its small size, speed and ease of modification. May still be.
One thing I like is that it supports the Raspberry Pi Pico and Raspberry Pi Pico W (the W meaning wireless). This is the microcontroller I would use if I had to pick "right now". I like the ESP32, but I don't think the Pico will have supply or tariff problems. The performance-utility-cost is very close, with the ESP32 a bit faster...maybe. I believe that on the software front the Pico might be even better: being part of the Raspberry Pi infrastructure, it has a lot of hackers using it.
One thing I didn't see it having is software for CANBUS. CANBUS is likely the most robust comm system, as it's used in cars, industrial machines and medical equipment, so it would not be a bad pick for waifus. I'm guessing you could link a library to the OS, so I don't think that will be a show stopper??
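I can't speak to zeptoforth's CAN story, but application-level CAN is simple enough that linking a library in shouldn't be a show stopper. A hypothetical host-side sketch with the python-can library (assuming a Linux socketcan interface named can0 is already configured; the payload layout is made up):

import can

# Open the bus (the socketcan channel name is an assumption for this sketch).
bus = can.Bus(interface="socketcan", channel="can0")

# CAN frames are tiny: an arbitration ID plus up to 8 data bytes.
# Made-up payload: joint 0x21, target angle 90 deg, speed 50.
msg = can.Message(arbitration_id=0x121, data=[0x21, 90, 50],
                  is_extended_id=False)
bus.send(msg)

# Lower IDs win arbitration, which is part of what makes CAN robust:
# a safety-stop frame at ID 0x001 always beats routine telemetry.
reply = bus.recv(timeout=1.0)   # returns a can.Message or None
if reply is not None:
    print(hex(reply.arbitration_id), list(reply.data))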
>>38365 Tried it again opening a new tab first. Crash. Very odd. Opens in Firefox normal web fine.
>>38406 >>38408
>Tor with a Brave browser
Exactly the same. (Still) works on my box, bro. **
>>38407
>Chobitsu is all about C & C++, and I'm not knocking that, but Forth's speed is right up there with them.
Yep, Forth is based. I'm simply not conversant with it, nor does it have the mountains of libraries available for it that C & C++ have today. That is a vital consideration during this early, formative era of robowaifu development. Cheers, Grommet. :^)
---
** https://trashchan.xyz/robowaifu/thread/26.html#1002
Edited last time by Chobitsu on 05/12/2025 (Mon) 07:27:55.

Open file (659.28 KB 862x859 lime_mit_mug.png)
Open-Source Licenses Comparison Robowaifu Technician 07/24/2020 (Fri) 06:24:05 No.4451 [Reply] [Last]
Hi anons! After looking at the introductory comment in >>2701, which mentions the use of the MIT licence for robowaifu projects, I read the terms: https://opensource.org/licenses/MIT
Seems fine to me; however, I've also been considering the 3-clause BSD licence: https://opensource.org/licenses/BSD-3-Clause >>4432
The reason I liked this BSD licence is that endorsement using the creator's name (3rd clause) must be done by asking permission first. I like that term as it allows me to decide if I should endorse a derivative or not. Do you think that's a valid concern?
Initially I also thought that BSD has the advantage of forcing derivatives to retain the copyright notice; however, MIT seems to do that too. It has been mentioned that MIT is already used and planned to be used. How would these two licences interplay with each other? Can I get a term similar to BSD's third clause but with MIT?

Edited last time by Chobitsu on 07/24/2020 (Fri) 14:07:59.
105 posts and 14 images omitted.
>>34920 Heh. OK, thanks for the explanation, Anon. :^)
An all-purpose robot has the same revolutionary magnitude as the invention of the steam engine. Let's hope some money will come out of this, but there's also people that measure their success not only on their monetary compensation but on the impact they had on the course of history.
>>34937
POTD
>Let's hope some money will come out of this, but there's also people that measure their success not only on their monetary compensation but on the impact they had on the course of history.
I think I can state categorically that a significant portion of regulars on /robowaifu/ are dreamers, and we all think about the amazing transformation to civilization (indeed, redeeming it from the literal brink) that robowaifus represent, peteblank. Cheers. :^)
> (MIT licensing-argument -related : >>36315 )
So, I recognize both why the GPL exists, and why Anons would argue for its use. OTOH, I also (very much) recognize why permissive licenses like BSD/MIT exist, and why I and others argue for its use. Question: I've seen several opensauce projects released under a 'Dual-License' scheme, which apparently lets the user pick which one they want to adopt. While, IIRC, these were all some variant of the restrictive (eg, GPL-esque) license approach, why couldn't we release all our code here as both restrictive & non-restrictive licenses (ie, GPL3 or MIT -- you choose)? <---> And if this does indeed turn out to be a legitimate approach, what does Anon think the effects would be? Please discuss.
Edited last time by Chobitsu on 05/10/2025 (Sat) 15:07:24.

Open file (5.57 KB 256x104 FusionPower.jpg)
Open file (37.68 KB 256x146 WiafuBattery.jpg)
Open file (124.55 KB 850x1512 SolarParasol.jpg)
Open file (144.39 KB 850x1822 GeneratorWaifu.jpg)
Power & Energy Robowaifu Technician 04/25/2025 (Fri) 22:16:32 No.37810 [Reply] [Last]
Power & Energy
Robots require some form of energy to power them. We need to understand how to store energy in a way that will provide her with all the power she'll need. To clarify, "energy" is a system's capacity to do work over time, measured in Wh (watt-hours). Closely related is "power", the rate at which work is done, measured in W (watts). As an example, we could have a robot with an 80 Wh lithium-ion battery and two DC gear motors that consume 10 W when working.
You do not need to rely solely on batteries and motors; we can use other methods of storing energy, including compressed fluids, thermal energy, and light, among other things. For instance, glow-in-the-dark paint is useful for storing energy to use at night for safety. Solar panels or a generator can provide power through the energy of long-distance nuclear fusion or by extracting energy from some reaction.
Being part of a robot means we need to consider safety, mass, and volume. How will her energy and power system fit inside her? How will she deal with the mass? What happens when she runs out of energy? How can you minimize her energy use? What alternatives can be used to lower her cost of production and ownership? These rhetorical questions are all important when contemplating how to build a robot.
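To make the Wh/W units concrete, a quick back-of-the-envelope sketch using the OP's figures (assuming each motor draws 10 W):

battery_wh = 80.0       # energy capacity: 80 Wh
motor_w = 10.0          # each motor draws 10 W while working
motors = 2

total_draw_w = motor_w * motors          # 20 W
runtime_h = battery_wh / total_draw_w    # energy / power = time
print(f"{runtime_h:.1f} h")              # 4.0 h of continuous operation

# Duty cycle matters: if the motors only work 25% of the time,
# average draw drops to 5 W and runtime stretches to 16 h.
print(f"{battery_wh / (total_draw_w * 0.25):.1f} h")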
197 posts and 54 images omitted.
>>37920 Fair enough. But that seems a fairly exotic temp range to be managing for a complete novice at this. Perhaps we should start smol, grow big instead here, Anon? :^) >>37922 Neat! That's quite encouraging. I was beginning to wonder if it was going to be literally impossible for amateurs to build these things, given your post here : ( >>37919 ) (well, that and the fact there are [apparently] no commercial vendors of the things today [I would have expected dozens of them by now since the '50s]). But it looks like we can after all! Cheers, Grommet. :^)
Edited last time by Chobitsu on 04/27/2025 (Sun) 22:31:14.
>>37919
Probably, but supercritical CO2 runs at 300+ atmospheres of pressure, the kind of thing that periodically frags a Chinese PhD student.
>>37933 Good point.
Turn weaknesses into strengths: batteries are quite heavy, so use them as counterweights in designs.
>>38276 Good idea, Anon. @Kiwi was promoting a 'swinging, inverted pendulum' counterweight balance system for natural bipedal gaits here on the board. Your idea of using the dense batteries for such a mass-as-pendulum is a good one. Cheers, GreerTech. :^)

Speech Synthesis/Recognition general Robowaifu Technician 09/13/2019 (Fri) 11:25:07 No.199 [Reply] [Last]
We want our robowaifus to speak to us, right?
en.wikipedia.org/wiki/Speech_synthesis https://archive.is/xxMI4
research.spa.aalto.fi/publications/theses/lemmetty_mst/contents.html https://archive.is/nQ6yt
The Tacotron project: arxiv.org/abs/1703.10135
google.github.io/tacotron/ https://archive.is/PzKZd
No code available yet; hopefully they will release it.
github.com/google/tacotron/tree/master/demos

Edited last time by Chobitsu on 07/02/2023 (Sun) 04:22:22.
408 posts and 144 images omitted.
Open file (16.14 KB 474x266 Minachan.jpeg)
>>38285 https://decrypt.co/316008/ai-model-scream-hysterically-terror They're working on it. Not to say you can't work on it yourself, but rather it's not a deliberate choice to leave out emotion. Also, you can do some tricks just by changing settings. I got Galatea to sing just by slightly lowering her speed. >pic related A monotone voice can actually be cute
>>38286
>A monotone voice can actually be cute
Yes, but your waifu needs to be aware in realtime of the tone of your voice as she listens to you speak, so that she can reply to you with the correct vocal intonation.
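One crude way to get a tone signal is pitch & loudness tracking. A hypothetical sketch using librosa (the filename and thresholds are made up for illustration; real prosody detection needs much more than this):

import numpy as np
import librosa

# Load a short voice clip (hypothetical file) and track fundamental frequency.
y, sr = librosa.load("owner_says_something.wav", sr=16000)
f0, voiced_flag, voiced_prob = librosa.pyin(
    y, fmin=librosa.note_to_hz("C2"), fmax=librosa.note_to_hz("C6"), sr=sr)

pitch_hz = float(np.nanmean(f0))    # mean F0 over voiced frames (Hz)
energy = float(np.mean(y ** 2))     # rough loudness proxy

# Arbitrary cutoffs, purely illustrative: high pitch + high energy = agitated.
tone = "excited/agitated" if pitch_hz > 180 and energy > 1e-3 else "calm/flat"
print(pitch_hz, energy, tone)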
>>38268 >>38285 >>38287 Lol. NYPA, Anon. OTOH, if you want to try solving this together with us here, that would be great! <---> I'm glad that you bring up this topic. I think we all instinctively know when a voice is uncanny-valley, but sometimes it can be hard to put into words. You've made a good start at it, Anon. Cheers. :^)
>>38269 >It's definitely a case of "easier said than done". This. But I must admit, there has been some remarkable progress in this arena. Our own @Robowaifudev did some great work on this a few years ago. My ineptitude with getting Python to work properly filtered me, but he was pulling off some real vocal magic type stuff -- all locally IIRC.
> (audio LLM -related : >>38775 )

F = ma Robowaifu Technician 12/13/2020 (Sun) 04:24:19 No.7777 [Reply] [Last]
Alright mathematicians/physicists report in. Us Plebeians need your honest help to create robowaifus in beginner's terms. How do we make our robowaifus properly dance with us at the Royal Ball?
>tl;dr
Surely it will be the laws of physics and not mere hyperbole that bring us all real robowaifus in the end. Moar maths kthx.
147 posts and 28 images omitted.
This is a great conversation going on here, but can we please move it to the Vision thread ( >>97 )? We're all ** derailing the Mathematics/Physics thread at this stage now, I think. Open to other viewpoints on this, however (since the geometrical aspects are partly related to a degree). I just think this topic is primarily vision-related, and this conversation will be hard to find ITT two years from now! :^)
---
** I'm not referring to posts like ( >>38028 ), which clearly are related ITT.
Edited last time by Chobitsu on 04/30/2025 (Wed) 06:08:08.
>>38230
Mixing in a little truth in hopes of leaving your niggerpills & well-poisoning laying around behind you isn't going to work here, friend. Your insulting, BS post will be baleeted in ~3 days' time, out of respect for Anons here. Either stop being a (drunken, likely) a*rsehole, or find yourself banned (again).
<--->
>muh_maths
Yes, we're aware here it's going to take maths. And specifically, maths that runs on a computer -- ie, code. There's at least one freely-available C language implementation of a Mobile Inverted Pendulum (MIP) solution that I've linked elsewhere on the board (the eduMIP; and that tied directly to operating a robotics-oriented SBC hardware solution [the BeagleBone Blue]). Do you know of any others? That's what could be helpful to us all here. Specifically, we need one that follows the bipedal humanoid form, and not just a balancing, wheeled-base unit.
<--->
As you rightly point out, a (necessary) "spinal column" (or similar) is part of the solution needed. And most of our designs for a waifu also include a head. This 'thrown weight' of the head at the end of that multi-nodal complex lever (the spine) is indeed an interesting kinematics problem even were it hard-mounted just to a tabletop. Throw in the fact that it's instead mounted to a hips structure, and that 'mounted' atop a bipedal, multi-nodal pair of complex levers (the legs/knees/ankles/feet/toes complexes), and you have quite a fun problemspace to work! Her having arms & hands might be nice, too. :DD
And don't forget to manage path-planning; accounting for secondary-animation mass/inertia dynamics; multi-mode (walking, running, jumping, 'swooping' [as in dance], etc.) gaits; oh, and the body language too (don't forget that part, please)! And all running smoothly & properly -- moving in the realworld via actuators/springy tensegrity dynamics/&tc!
>tl;dr
Why not get started on it all today, peteblank!? I'm personally looking forward to enjoying the Waltz with my robowaifus thereafter. I'm sure we'd all be quite grateful if you, yourself, solved this big problem for us here! Cheers, Anon. :^)
Edited last time by Chobitsu on 05/05/2025 (Mon) 17:56:54.
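Not the eduMIP code (that's C on the BeagleBone Blue); just a toy Python sketch of the core balance loop, a PID controller holding an inverted pendulum upright, to show the shape of the problem before you ever get to full bipedal gaits (gains & dimensions are made up):

import math

g, L, dt = 9.81, 1.0, 0.001      # gravity, pendulum length (m), timestep (s)
kp, ki, kd = 120.0, 5.0, 25.0    # PID gains, hand-tuned for this toy model

theta, omega = 0.15, 0.0         # initial lean ~8.6 deg from vertical, at rest
integral, prev_err = 0.0, -theta

for step in range(5000):         # simulate 5 seconds
    err = -theta                 # setpoint is upright (theta = 0)
    integral += err * dt
    deriv = (err - prev_err) / dt
    u = kp * err + ki * integral + kd * deriv   # control acceleration
    prev_err = err

    # Toy dynamics: gravity tips it over, the controller pushes back.
    alpha = (g / L) * math.sin(theta) + u
    omega += alpha * dt
    theta += omega * dt

print(f"final lean: {math.degrees(theta):.3f} deg")   # ~0: balanced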
>>38234 Not to mention the perfectly tuned software feedback for balancing and foot sensation, as well as visual terrain analysis.
>>38238 Yes, I didn't make mention of all the visual and other datatype overlays & analysis, or of all the sensorimotor-esque sensor fusion feedback loops (PID, etc) needed. Not to mention all the concerns for her human Master's privacy, safety, & security needs at a more /meta level. This is a massive design & engineering undertaking. If the baste Chinese can actually release these at a commercial scale to the public for just US$16K, it will be a breakthrough. And we here need to go much-further & cheaper-still than that!! :^) FORWARD
Edited last time by Chobitsu on 05/06/2025 (Tue) 02:46:29.
>>7777 > (dancing -related : >>38456 )

Hand Development Robowaifu Technician 07/28/2020 (Tue) 04:43:19 No.4577 [Reply] [Last]
Since we have no thread for hands, I'm now opening one. Aside from the AI, it might be the most difficult thing to achieve. For now, we could at least collect and discuss some ideas about it.
There's Will Cogley's channel: https://www.youtube.com/c/WillCogley - he's on his way to building a motor-driven biomimetic hand. It's for humans eventually, so there's not much space for sensors right now, which can't be wired to humans anyway. He knows a lot about hands, and we might be able to learn from it and build something (even much smaller) for our waifus.
Redesign: https://youtu.be/-zqZ-izx-7w
More: https://youtu.be/3pmj-ESVuoU
Finger prototype: https://youtu.be/MxbX9iKGd6w
CMC joint: https://youtu.be/DqGq5mnd_n4
I think the thread about sensoric skin >>242 is closely related to this topic, because it will be difficult to build a hand which also has good sensory input. We'll have to come up with some very small GelSight-like sensors.
F3 hand (pneumatic):
https://youtu.be/JPTnVLJH4SY
https://youtu.be/j_8Pvzj-HdQ
Festo hand (pneumatic):
https://youtu.be/5e0F14IRxVc
Thread >>417 is about Prosthetics, especially Open Prosthetics. This can be relevant to some degree; however, the constraints are different. We might have more space in the forearms, but we want marvelous sensors in the hands and have to connect them to the body.

109 posts and 39 images omitted.
>>33001 > ( hands-related : >>33356)
Related: >>35641 >Functional evaluation of a non-assembly 3D-printed hand prosthesis > ... developed a new approach for the design and 3D printing of non-assembly active hand prostheses using inexpensive 3D printers working on the basis of material extrusion technology. This article describes the design of our novel 3D-printed hand prosthesis and also shows the mechanical and functional evaluation in view of its future use in developing countries. We have fabricated a hand prosthesis using 3D printing technology and a non-assembly design approach that reaches certain level of functionality. The mechanical resistance of critical parts, the mechanical performance, and the functionality of a non-assembly 3D-printed hand prosthesis were assessed. The mechanical configuration used in the hand prosthesis is able to withstand typical actuation forces delivered by prosthetic users. Moreover, the activation forces and the energy required for a closing cycle are considerably lower as compared to other body-powered prostheses. The non-assembly design achieved a comparable level of functionality with respect to other body-powered alternatives. We consider this prosthetic hand a valuable option for people with arm defects in developing countries. https://journals.sagepub.com/doi/epub/10.1177/0954411919874523
>>38171 POTD Really exciting to see his progress with this new print-in-place design. Thanks, Anon! Cheers. :^)

Robo Face Development Robowaifu Technician 09/09/2019 (Mon) 02:08:16 No.9 [Reply] [Last]
This thread is dedicated to the study, design, and engineering of a cute face for robots.
345 posts and 227 images omitted.
Open file (202.12 KB 1243x2048 Solid Eyes.jpeg)
My "Solid Eyes" experiment shows you can have some of the aesthetic benefits of screen faces with a non-electronic solid design.
Open file (202.12 KB 1243x2048 Solid Eyes.jpeg)
>>38111 I'd love to hear opinions about the designs
Open file (202.12 KB 1243x2048 Solid Eyes.jpeg)
>>38113 Slightly different configuration of the screen eyes, more in-line with the face
>>38111 I see what you're saying. Since we are all pretty used to seeing dear Galatea with screen eyes, these flow right along with that same motif. They certainly resemble common eyewear more than the phone does. Do you plan to install cams in them or anything? >>38113 >>38114 I think I prefer the first example just slightly ATM -- but that could easily simply be my current mood! I might flip on that tomorrow, depending. Also, I expect the IRL experience for you with the screen right there with you is different for us merely seeing the images posted. >tl;dr I personally think this is something that you yourself can judge best, GreerTech. Cheers. :^)
Edited last time by Chobitsu on 05/02/2025 (Fri) 12:29:14.
https://m.youtube.com/watch?v=yWrldOS6xBw >=== -rm uri fingerprinting
Edited last time by Chobitsu on 05/04/2025 (Sun) 17:21:49.

Open file (522.71 KB 1920x1080 gen.png)
Nandroid Generator SoaringMoon 02/29/2024 (Thu) 13:54:14 No.30003 [Reply]
I made a generator to generate nandroid images. You can use it in browser, but a desktop version (that should be easier to use) will be available. https://soaringmoon.itch.io/nandroid-generator
Not very mobile-friendly unfortunately, but it does run.
I made a post about this already in another thread, but I wanted to make improvements and add features to the software.
>If you have any suggestions or ideas other than custom color selection, which I am working on right now, let me know.
24 posts and 12 images omitted.
>>34154 Sweet
>>34154 Glad to see you keeping your project advancing, SoaringMoon. Keep it up! Cheers. :^)
Open file (55.85 KB 790x971 Screenshot Galatea.jpg)
I made my Galatea design in the Generator.
Open file (129.69 KB 658x704 Galatea nandroid.png)
>>34723
Remade it better: Galatea Nandroid
Open file (185.99 KB 631x716 SPUD Nandroid.png)
@Mechomancer, I took the liberty of making a SPUD Nandroid
