/robowaifu/ - DIY Robot Wives

Advancing robotics to a point where anime catgrill meidos in tiny miniskirts are a reality.



F = ma Robowaifu Technician 12/13/2020 (Sun) 04:24:19 No.7777
Alright mathematicians/physicists, report in. Us plebeians need your honest help to create robowaifus in beginner's terms. How do we make our robowaifus properly dance with us at the Royal Ball?
>tl;dr
Surely it will be the laws of physics, and not mere hyperbole, that bring us all real robowaifus in the end. Moar maths kthx.
From this page https://docs.sel4.systems/projects/sel4/frequently-asked-questions
"...To the best of our knowledge, seL4 is the world’s fastest microkernel on the supported processors...64-bit RISC-V kernel is about 9,400 SLOC...Presently seL4 runs on Arm v6, v7 (32-bit) and v8 (64-bit) cores, on PC99 (x86) cores (32- and 64-bit mode), and RISC-V RV64 (64-bit) cores..."
NICE! I wonder if it will run on ESP32 microcontrollers. If so it would be SO IDEAL. Even if it doesn't, you could use serial comms to talk to them.
This is great stuff Grommet. Thanks for all the research. If you feel strongly about this as an OS platform, then I'd think it's in our collective best interests here to investigate it thoroughly in a practical way?
>>20481
>One BIG problem is that they seem to be trying to bury all the code from these microkernel L4 projects
OK, I'm wrong about this. I looked at all this L4 stuff several years ago, and when I looked recently a bunch of it seems to be gone, but seL4 is current. The guy who came up with the code that made L4 possible (Jochen Liedtke) died. (Likely another one of those "died suddenly" cases we see so much of.) It may very well be that a lot of his projects died with him and the sites went down. I've looked at the L4 stuff for a long time, but not recently.
I want to add: if you are interested in robowaifus, you're going to need microcontrollers for input and output. There's one called the ESP32 that is really the Swiss Army knife of microcontrollers. Here are some comments I made on them >>12474 >>18902 >>18778 >>12480 Here I do some math on the cost to build with these microcontrollers. >>13408
BTW here's a paid OS for ESP32 microcontrollers: https://mongoose-os.com/mos.html
>>20507
I'll check them out. But I'd rather some other anon specialize in microcontrollers and OSes. I'm already deep into the AI part and I'm not sure I have enough time to spare to learn something completely new. If we are to make proper robowaifus, we need different specializations working together, instead of everyone becoming a jack of all trades, master of none. BTW, we'll be solely using RISC-V in our robowaifu microcontrollers, right? I wouldn't trust closed-source ARM and x86, and IIRC the MIPS creators have moved on to support RISC-V.
>>20506
>investigate it thoroughly in a practical way?
Part of this is that I read a lot of stuff because this sort of thing interests me. It may very well be that there are big problems that are not readily apparent on the surface. I'm trying to point people to stuff I've seen that "seems" to work, but there's no doubt that I could miss a lot of others that could be better. This seL4 looks really good though. It has definitely been used for major systems like missiles and planes and stuff like that, and the source is available.
That being said, none of this stuff is really easy. The Boston Dynamics people worked on this stuff for many years. Fake dog and fake oxen. I suspect that these guys, BD, coded everything into a big wad of code with all this very specific motion code. I can't say I know for sure, but I "think" that if we were to make some basic movement code, say a rough outline of movement, and then run AI so that it learns to walk, it would be faster and less computationally dense. A lot faster and cheaper. People here have said that has been done and it didn't work. Maybe if it could watch itself, and then reference an actual human walking to correct itself as it learned? Not easy. If you can build a waifu, you could also build an exoskeleton, and that could be used to program the waifu.
>>20510
Does BD ever plan to implement some kind of AI or other adaptability code in their robots? Otherwise, the Spot robots that they plan on selling to the police will not take off. Nor will their other robots (I forget the name). Scripted obstacle courses and dances can only take you so far. I'm so frustrated with BD; they're the only real competition to Tesla's Optimus. Unless Honda brings out ASIMO 2.
Here's using an ESP32 for face recognition. I don't know if the code is AI or not. https://randomnerdtutorials.com/esp32-cam-video-streaming-face-recognition-arduino-ide/
Enough, I'll stop filling up comments now.
>>20512
>BD
I have no real knowledge of what they are doing, but look at their stuff. It looks like they programmed in all this motion stuff with physics and all of that. Or to me it does. I think that path is a dead end. But what do I know, I'm just some guy on the internet.
>>20513 >Enough, I'll stop filling up comments now. No, don't stop lol. Everything you've ever posted here has been useful information Grommet, some very-much so. :^)
>>20514
Exactly what I said. They've already hit a dead end, IMO. How many different real-world scenarios can they hardcode into their robots? They'd really better start investing in AI. Hopefully they can poach some talent from the Tesla guys working on Optimus. While the Optimus robot generally felt like a sore disappointment, I thought the AI and vision part was pretty good. I'd like to see it work in BD robots.
>>20512
>I'm so frustrated with BD
I mean, I think I understand your position, Anon. But frankly, I see any fumbles by the big players as feeding into our own success here on /robowaifu/ and our related cadres out there. More 'breathing room', as it were.
>>20525
I don't particularly care who gets working humanoid robots first. I see all advances as a win. Besides, even their wins would eventually trickle down to DIY anons. I'd definitely buy a BD robot, take it apart to see how they made it, then make my own.
>>20524
>While the Optimus robot generally felt like a sore disappointment
I predict Tesla will soon shoot far past any other player in this arena, anywhere in the midterm timeframe. We can be thankful they aren't targeting the robowaifu market! :^)
>I thought the AI and vision part was pretty good.
It is interesting. But not too surprising IMO; after all, it's literally the same AI board they use in their self-driving cars.
>===
-minor edit
Edited last time by Chobitsu on 02/21/2023 (Tue) 10:55:09.
>>20526
Ah, I see. Well, I understand the perspective at least. However, I personally feel there are moral stakes around this entire domain that make it vital we get robowaifus out there as a real cottage industry well before the big players invade that niche and manipulate the systems to make anything but their own globohomo-approved mechanical c*nts illegal. History shows us their slimeball tactics in this manner time and time again.
>===
-prose edit
Edited last time by Chobitsu on 02/21/2023 (Tue) 10:54:01.
>>20527
Honestly, as much as I hate Elon, I think if anyone's going to start selling robowaifus and disregard the mainstream outrage, it's Musk.
>>20528
Thing is, it's much easier for the world's governments to start banning robowaifus when it's just a small DIY scene. But when you've got megacorps who would lobby billions, it's much harder to ban. The more robowaifus proliferate, both in the DIY scene and in large corporations, the harder it will be to ban them. And I don't think any megacorp will actually advocate banning robowaifus. Their only ideology is their bottom line, and robowaifus would potentially be a trillion-dollar industry.
>>20529
>I think if anyone's going to start selling robowaifus and disregard the mainstream outrage, it's Musk.
I too see him creating new companion lines once we here succeed at it first. He is clearly targeting his own factories first and foremost, and then specialized labor uses. Then he'll likely move into medical/patient care. During all that time, we'll be working on perfecting robowaifus, ofc! :^)
>>20529
I disagree on both points. Since we are open-saucing everything here, it's a genie out of the bottle. Since the East will run with this quickly, the Globohomo will never be able to stop it. Secondly, I feel you err in your estimate that the 'megacorps' and the government are somehow two distinct entities. They haven't been for a long time. That's why their full, formal title is The Globohomo Big-Technology/Government. And it's also the odd state of affairs that the tech-tail is wagging the beltway-dog, very clearly.
>===
-minor edit
Edited last time by Chobitsu on 02/21/2023 (Tue) 11:08:18.
>>20530
I do think they're working in parallel on adapting Optimus to waifu duties, and they'll release it a few years after the factory robots. And I do not believe the Globohomo to be one huge monolith with a single goal. There are different factions with competing interests within it, hence why you often find governments and companies coming into conflict. Among them, I believe companies to be more shortsighted and focused on profits than on some world-government/global-control scheme. They're also the ones who fill the pockets of politicians. They can definitely see the potential profits in offering a robowaifu, especially in this day and age with billions of lonely men.
>>20531
>I do think they're working in parallel on adapting Optimus to waifu duties
Nice. But they'll have to create something on an entirely different frame geometry (which will require re-optimizing all the articulation & actuation code).
Fair enough about your estimates on the Globohomo. I might have time & energy to debate this topic later. Speaking of my energy, we're well off-topic ITT. Any further discussion, please move it somewhere else like /meta or news. It's a tedious, error-prone process copy-pasting each post one-by-one, with a new post each, over into its proper thread by hand, and then deleting all the originals; but sadly that's exactly what I have to do each time I 'move' a conversation over to another thread. I'd like to 'cut the saying short' if you take my meaning! :^)
>===
-prose edit
Edited last time by Chobitsu on 02/22/2023 (Wed) 05:04:28.
I also should mention that the microcontrollers will need an OS to lessen the burden of writing all this super-complicated timing stuff ourselves. Really hairy stuff. A good open-source one that works on a huge number of microcontrollers, including my favorite the ESP32, is FreeRTOS. Real-time, so it will be responsive. It has good expansion features for legality. How long before they add in safety rules and regs? Not long. It's MIT-licensed so you're free to do as you please, but it has, if you pay, verified guarantees. There seems to be a good deal of documentation for it, and other libraries to use. Code for RISC-V and ARM and others. This means we can write code and use whatever microcontroller we can get at the least cost and highest performance. https://www.freertos.org/index.html

I don't think this will work for full microcomputer processors, but I'm not sure. It would be nice if it did; then we could use more powerful RISC-V chips for processing speech, vision, etc. while using the same OS everywhere. Less to learn. I'm not saying I know how to do this, but my thinking is: we could use all these microcontrollers for input/output, mostly to walk and move around, but ALSO use the fairly large computing power they have built in to do processing as well. So say the robowaifu wants to clean something or do something complicated: it could mostly stop moving and use a little of its microcontrollers' distributed computing power to aid in whatever task it's concentrating on. Much like humans. When they concentrate, they slow down and think. I think, as I said before, something like seL4 will likely have to be used for the main processor for speech, understanding and navigation.

There's something I talked about earlier that I think is really important. In order to keep cost low we will have to use some sort of off-the-shelf microcontroller. The ESP32 I like so much has a large number of inputs (they can read capacitive sensors for touch) and a large number of outputs. Ideal. Now instead of building some contrived output board, we use these built-in outputs (driving transistors, or more likely MOSFETs), AND the big win is that these things have a lot of computing power. So we have all this computing power, and if we can share it between the various microcontrollers, it may very well be that for basic moving around and not bumping into things we could just use the built-in computing power of these things and not have any main processor at all. Later, for speech or higher functions, we could add a fast RISC-V microprocessor and link it to the other controllers.

What kind of power are we talking about? I wrote before, "...You can get them for less than $9USD... So at 300 muscles / 16 PWM output channels per micro-controller, means we need 19 and at $9 each=$171 But with that comes 19 x 600 = 11,400 DMIPS." DMIPS basically means that many million integer instructions per second. It's a lot: 11.4 billion total per second with 19 MCs. >>12474
Let's say we check every output and input every micro second, so 1,000 times a second, and it takes 10 instruction cycles to do so. That leaves us 599,560,000 instructions a second to do...something with. And that's just one processor. Most things we are going to do are compares and tests: is the arm here or there, has the touch sensor touched anything, etc. Most of these are values you compare to some other value. I don't think the average computing load for that will be very large. Even if it's ten or a hundred times larger, we still have a hell of a lot of computing left over.
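To make the timing side concrete, here is a minimal sketch of what such a fixed-rate I/O-scan task could look like under FreeRTOS on an ESP32. The task API is the real FreeRTOS one; readSensors()/updateOutputs() are hypothetical stand-ins for actual drivers, and a 1 kHz tick rate is assumed:

```cpp
// Minimal sketch: a fixed-rate I/O scan task under FreeRTOS on an ESP32.
// readSensors()/updateOutputs() are hypothetical placeholders for real drivers.
// Assumes configTICK_RATE_HZ = 1000 so a one-tick delay equals 1 ms.
#include "freertos/FreeRTOS.h"
#include "freertos/task.h"

static void readSensors()   { /* poll GPIO/ADC/capacitive-touch inputs here */ }
static void updateOutputs() { /* refresh the 16 PWM channels here */ }

static void controlLoopTask(void *) {
    TickType_t lastWake = xTaskGetTickCount();
    for (;;) {
        readSensors();
        updateOutputs();
        // Drift-free periodic wakeup: 1,000 scans per second.
        vTaskDelayUntil(&lastWake, pdMS_TO_TICKS(1));
    }
}

void startControlLoop() {
    // 4 KB stack, priority 5, pinned to core 1 (core 0 handles Wi-Fi/BT).
    xTaskCreatePinnedToCore(controlLoopTask, "ctrl", 4096, nullptr, 5, nullptr, 1);
}
```

The point of vTaskDelayUntil over a plain delay is that the scan rate stays fixed even when the loop body's runtime varies, which is exactly the "super-complicated timing stuff" the OS takes off your hands.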
I think with the right algorithms, walking and moving about will take very little computing power. After all, insects move about just fine and they have next to no computing power. Of course, figuring out how to do this might be tricky. I bet someone, somewhere, has a paper about this, but I'm not sure I've seen it. Looking up insect-movement papers might be helpful. "...Insect brains start at about 1000 neurons..."

It's a lot of power these MCs have, and communication is built into these things with the CAN 2.0 bus, like they use in cars, industrial machinery and medical equipment. Now none of this is easy: learning some operating system, writing code for microcontrollers, using CAN bus commands. But think if you had to write this stuff yourself from scratch. Hopeless. Almost a lifetime job. But if you can crib through a manual and copy other people's snippets of code, cut and paste code from these specs that are used by lots and lots of people (meaning more code in the open), maybe you could get something working.

If anyone has any ideas on how to send messages over the CAN 2.0 bus for movement of actuators, I would really like to hear them. You would have to send some sort of vector, meaning a direction, in or out, and a velocity which would be turned into a voltage to drive the actuator. Now you want to be able to change this in the middle of movement, so how do you do that? Could you send a beginning vector and then maybe a second vector? How do you coordinate these vectors so you get walking instead of jerky, stuttering movement? Could you have one microcontroller control all the muscles in one limb, so you could tell the limb to move, say, forward ten inches and two inches to the side, and then have the computer figure out how to work these all together? Could you use some sort of AI software to do all this coordination? Coding it all by hand could take forever. What kind of AI code would you use for this sort of thing? Lots of questions, no answers...yet.
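On the CAN question, here is one possible message shape. This is purely a sketch under my own assumptions (one actuator per CAN ID, and ESP-IDF's TWAI/CAN driver already installed and started); it is not any established protocol:

```cpp
// Sketch: packing an actuator command into a single classic CAN 2.0 frame.
// Convention assumed here (ours, not a standard): one actuator per CAN ID.
#include <cstdint>
#include <cstring>
#include "freertos/FreeRTOS.h"
#include "driver/twai.h"  // ESP-IDF CAN driver; must be installed & started first

struct ActuatorCmd {
    int16_t  targetPos;  // goal position in small fixed units; sign = direction
    int16_t  velocity;   // approach speed the node converts to drive voltage
    uint16_t rampMs;     // time to blend from the current motion into this goal
    uint16_t seq;        // sequence number so stale frames can be discarded
};
static_assert(sizeof(ActuatorCmd) == 8, "must fit one 8-byte CAN payload");

bool sendActuatorCmd(uint32_t actuatorId, const ActuatorCmd &cmd) {
    twai_message_t msg = {};
    msg.identifier = actuatorId;   // which muscle this frame addresses
    msg.data_length_code = 8;
    std::memcpy(msg.data, &cmd, sizeof(cmd));
    return twai_transmit(&msg, pdMS_TO_TICKS(10)) == ESP_OK;
}
```

One answer to the "second vector" question falls out of this shape: a newer frame simply overwrites the node's current goal, and the per-limb microcontroller does the smoothing/interpolation toward it, so you can retarget mid-move without jerky restarts.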
>>20558
This is absolutely great stuff, Grommet. Thanks!
>Let's say we check every output and input every micro second so a 1000 times a second
I'm guessing you mean milli-second?
>If anyone has any ideas on how to send messages over CAN2 bus for movement of actuators I would really like to hear it.
There's RobowaifuDev's IPCNet (>>2418). Also, you can read this (>>772). My apologies I can't give the response ATM that your post deserves, Anon. Cheers. :^)
>===
-minor edit
Edited last time by Chobitsu on 02/22/2023 (Wed) 04:58:59.
I hate to keep changing things, but I haven't looked at real-time operating systems for microcontrollers in a long time and things have really changed. I found a really interesting OS called Zephyr. https://en.wikipedia.org/wiki/Zephyr_(operating_system) This looks good. It is run by the Linux Foundation and is maintained. It also will run on larger, faster processors, so that's a big plus. You could run MCs and your main high-power processors on the same OS. Big win. Saves you from having to learn more than one system. Huge advantage.

Here's a comparison of some well-known, I think, OSes for microcontrollers: https://micro.ros.org/docs/concepts/rtos/comparison/ I'm still looking at these to see which might be best. It's a big deal to pick one because of all the effort you will put into learning how to use it. Once there, you're not likely to change, so having the right mix of ease of use and other people having code you can reuse is important.

While I was looking at the OS comparison link, I went up to the main link and found there is such a thing as a "Robot Operating System" which uses the lower-level OSes listed as a base and rides on top of them. Holy smokes, there's a lot of books and data on this thing. It's also free. I'm asking all these questions about how to do all this coordination stuff; maybe it's already done??? https://micro.ros.org/ Here's a link to a search for a LOT of books on this Robot Operating System: http://libgen.rs/search.php?req=Robot+Operating+System&open=0&res=25&view=simple&phrase=1&column=def

Have to look further and see what this is all about. My first impression is this is the shit. WOW! This is what's needed (if it works). There are tons of books and documents, and the books say you can simulate robots. How cool is that? Get the right actuator, then simulate all the movements and software before you build. You could make progress with zero money, just simulation. BUT does it really work? The marketing blurbs are very impressive, but you know how these things are.

I've got an idea for using robots for some dangerous work, stuff you could do yourself or with two guys and maybe make some money. It wouldn't be a waifu, but all the techniques to build the equipment and operate it would use all the same tools and provide, maybe, some cash. This really excites me. I may be buried in some books for the next few months.
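To give a flavor of what ROS code looks like, here is a minimal ROS 2 node in C++. The rclcpp API is real; the node name, topic name, and 10 Hz rate are invented for illustration:

```cpp
// Minimal ROS 2 (rclcpp) node publishing a joint-angle command at 10 Hz.
// Node and topic names are made up for this example.
#include <chrono>
#include <memory>
#include "rclcpp/rclcpp.hpp"
#include "std_msgs/msg/float64.hpp"

class JointCmdNode : public rclcpp::Node {
public:
  JointCmdNode() : Node("joint_cmd_node") {
    pub_ = create_publisher<std_msgs::msg::Float64>("leg/joint_cmd", 10);
    timer_ = create_wall_timer(std::chrono::milliseconds(100), [this] {
      std_msgs::msg::Float64 msg;
      msg.data = 0.5;  // placeholder target angle, in radians
      pub_->publish(msg);
    });
  }
private:
  rclcpp::Publisher<std_msgs::msg::Float64>::SharedPtr pub_;
  rclcpp::TimerBase::SharedPtr timer_;
};

int main(int argc, char **argv) {
  rclcpp::init(argc, argv);
  rclcpp::spin(std::make_shared<JointCmdNode>());
  rclcpp::shutdown();
  return 0;
}
```

The appeal of ROS here is exactly this shape: every sensor and actuator becomes a node publishing or subscribing on named topics, and a simulator can subscribe to the same topics as the real hardware.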
>1000 times a second
>I'm guessing you mean milli-second?
Oops...yes
>Car Hacker's Handbook
Much thanks. I need that thread. I'm looking at robots AND drones right now. I've got an idea that I can use drones to carry ropes to the tops of trees, attach cables, and crawl or pull up robot tree cutters, then trim or cut down trees. There's some money in this, and my Mom has a tree that really has to have its limbs cut. I've been debating how to trim this down. I bought a tree harness and have some ropes, but I'm terrified of climbing this really tall tree. And let's not even talk about using a chainsaw while up in the tree. And I know how to use one, but...it's like 75 feet high. The cost of getting someone to trim these is really high, and this seems the right time to try something different. It's going to fall on the house. So I'll get some tree-cutting bids and see just what it will take, and "try" to build some sort of tree-trimming robot. In the process I'll learn all the skills needed for waifus, save my Mom's house, and save her some bucks.

My thinking is: get away from chainsaws. Build something like huge garden shears, like the beak of a parrot with inward-cutting blades. I've talked a lot about transmissions; I think I can build one of those I talked about. Slowly close the jaws and snip trees right in two. If a limb is too big, nip at it a little at a time. A good selling point: with electric power there's no engine noise, or much less. The job is as dangerous as can be; I know someone who used to do it. A chainsaw will tear a huge chunk out of you in seconds if you let it hit you. I saw a guy hit his leg one time. DAMN, the blood. It went right into his leg. There's also the homeowner's fear of the guy killing himself and suing, so knowing you'll be on the ground and using robots is, I think, a big selling point.

That's my plan over the next many months. Likely, or the plan for now, is to use this kick-ass-sounding Robot Operating System, the ESP32, and try to make it work. I can weld and have a decent knowledge of metal-casting aluminium (but haven't done it), so I expect I can build whatever I need, and what I can't, maybe bearings, is easy to get. I have lots of reluctance-motor ideas. I'll just have to build some and see what works. A PLAN!
>>20497
>over time you need an OS or what will happen is you will have a big binary pile of stuff that will never ever, ever be ported but to one specific thing and then immediately bit rot will set in and it will become useless.
I think bit rot can be mitigated by something like btrfs or zfs. Also, maybe I don't understand, but if something runs on a known system you can always emulate that. Anyways, thanks for the reminder that maybe we should use some RTOS, for movements at least. Ideal would be to have code that can be used on a more common system and then transferred to such a system as soon as necessary. It's likely that people will implement things in Arduino C++ or MicroPython these days.
>but I'm terrified of climbing this really tall tree.
I did this as a child for fun. You have four limbs; it's unlikely that all of them fail or slip at the same time. Also, the branches below you would catch you while falling. Well, being slender, fit and young would help. A belt on top of it should make it very safe.
>chainsaw ... I saw a guy hit his leg one time.
Yeah, please don't do that.
>>20575 It's a great business idea Anon. I hope you can successfully pull this off as a plan. Maybe we should consider starting a non-robowaifu projects prototyping thread here? Certainly this tree surgeon robot would be much simpler than a robowaifu will be, yet touch on many of the similarly-needed design & engineering concepts. >The job is dangerous as can be... Yeah, please do not do this yourself Grommet. You're too valuable to our group here! >>20582 >Ideal would be to have code that can be used on a more common system and then transferred to such a system as soon as necessary. Agreed. >=== -minor edit
Edited last time by Chobitsu on 02/22/2023 (Wed) 11:14:42.
>>20582
>I think bit rot can be mitigated by something like btrfs or zfs
Maybe I'm using the wrong terminology. You misunderstand. I'm not trying to be rude, just more precise. It's going to take a while to get this stuff to work. Each microcontroller will have its own glitches and exceptions. If you code for just one, then very soon the latest and greatest will come out and you will be covered up in weird bugs. "If" we can use these robot operating systems, the companies or software providers will show you which microcontrollers work with the software and you can port to them easily. You're calling functions instead of writing raw registers and assembly code.

I have very little programming experience. Only FORTRAN and hexadecimal assembly. Hex programming is hard and takes forever. I could do it, but the time required is immense. The assumption I'm making is that these libraries can do a lot of the little stuff while we concentrate on the larger movements of the robot. All the stuff related to timing and a whole bunch of OS-type housekeeping: I don't think I can do that, or at least not in this lifetime.

I've been looking a little at the Robot OS and it's hard enough. The link I gave above is micro-ROS; there's a larger Robot Operating System with way more features. It runs on Linux. So we may need a single-board computer with Linux, then microcontrollers to operate the limbs, sensors, etc. Networked together like nerves in a human. Let's hope that micro-ROS, for microcontrollers, and the larger ROS are similar. I get the impression they are.

I can already see this will be harder than originally thought. I think you will almost have to learn C to do this. I've been downloading books for it. I have written a couple of C programs and I really hate this stuff. Programming is something that, to me, is satisfying "after" you finish, but a real pain in the ass while you are doing it.

As for climbing trees: when I was young I climbed plenty, but never with a chainsaw running, nor am I young any more.
>>20629
Your writing was a bit verbose before and I got confused trying to follow the discussion here. I can't always parse the whole comment, or I lose the train of thought in the conversation. I think you actually made the case for using an OS. Which I support, without claiming to know much about it, if there's no strong reason against it. Otherwise, someone has to make the case why not: an OS, or drivers from scratch, down to addressing the SBC? Just to make it a bit more efficient? Maybe I'm wrong, but this looks like overkill, or at least premature optimization. There should be some abstraction layer running on many SBCs; then we'll only need to write the code on top of it. What's wrong with some L4 OS again? Who's arguing against it? Do we even need it? Is it even necessary to have this amount of fail-safety? The real world is messy; we should make our robots able to sense and adapt to it, not move super fast and ultra precise.
>I think you will almost have to learn C to do this.
Maybe something like Nim that compiles to C is good enough? I looked briefly into L4 and it seems to have its own programming language, or maybe it's just a coding style?
>>20877
>nesper
>I generally dislike programming C/C++ (despite C's elegance in the small). When you just want a hash table in C it's tedious and error prone. C++ is about 5 different languages and I have no idea how to use half of them anymore. Rust doesn't work on half of the boards I want to program. MicroPython? ... Nope - I need speed and efficiency.
Thanks, this might come in handy. But why are we in the math thread?
>>20882 >But why are we in the math thread? My apologies to everyone here for contributing to this abuse big-time. NoidoDev, would you be willing to consider becoming involved directly in taking on the tedious tasks (>>20532) involved in cleaning our threads up with posts in their proper places? (Again, apologies for further compounding the problem with this post.)
>>20884 >would you be willing to consider becoming involved ... in cleaning our threads up with posts in their proper places? How? Trying to register as vol?
>>20901 Yes. You just have to get Robi to turn on account creation, then once you have one made, I'll assign that account as a volunteer here on /robowaifu/. Trust me, this isn't fun work, so you should give some consideration if you really want to take it on first.
Just so I can say I was not "totally" off topic: a lot of control theory, I believe, is loosely, very loosely, based on the same sort of ideas as Fourier transforms. Sine waves. I'm not a mathematician, but it seems most of these methods are based on waveforms, and the math is very slow because it generally uses sine waves. There's a family of math functions based on stretching and raising the peaks of waves, "wavelets", that is far, far faster. DjVu uses wavelets, a lot of oil-prospecting seismic-processing software uses wavelets to tease out very fine-grained underground structures from the data, and movies use these functions to compress data. I've read the processing time for signal processing can be ten times less using wavelets to analyze data, statistics, etc. Sine-wave-based signal processing seems to use far more processing steps. More computing time. Wavelets use much more of a simple add-and-subtract style without a lot of matrix algebra.

I can't help but think it may be analogous to the mass of matrix multiplications that AI uses now, compared to the way Xnor.ai processes AI far faster. I'm trying to grasp the big-picture pattern here. It seems that present AI (I'm going to equate it to a waveform) uses a lot of matrix multiplications to go over every single speck of the data, analyzing each and every little data point. Xnor.ai uses a far grosser examination: is this one clump of data larger than that clump of data? And then it passes the result on as yes or no. They only care about the larger coefficients when analyzing it. I see this as comparable to wavelet processing in a general, big-picture sort of way. I'm probably screwing this up, but I hope I've pointed things in a generally correct direction. https://en.wikipedia.org/wiki/Wavelet

Another offshoot of this idea is the "chirplet". There's a GREAT picture of the different waves that gives you a big-picture idea of what I'm trying, probably unsuccessfully, to convey at this link. I'll link the picture too. https://en.wikipedia.org/wiki/Chirplet_transform https://upload.wikimedia.org/wikipedia/commons/1/10/Wave-chirp-wavelet-chirplet.png

Look at how the different waves could be used to represent or analyze information. Here's my understanding of why this is a good thing. Look first at the "wave". Think: if you had to add a lot of these up, like a Fourier transform, it would take a lot of them to fit the signal we are approximating. I think the general idea is the same as successive approximation in calculus. So we add up all these waves to make them fit our actual data. Now look at the wavelet. It can stretch, and raise its peaks, to fit. I think this function uses fewer coefficients to fit the signal. Now look at the chirplet: since it already has a bit of stretch built into the function, it might take even less stretching and amplitude-raising to approximate the information waveform.

I think the basic idea is that you transform the signal of whatever we are trying to analyze into a convenient waveform (wavelet, chirplet, etc.), and then we can use simple addition and subtraction to quickly analyze the data and tease out what is going on in this formerly complex wave of data. This vastly simplifies the processing power needed. Now, me saying this like it's some simple thing... well, it's not. Figuring out "what" transform to use and how to set it up is difficult.
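To make the "add and subtract" point concrete, here is a minimal sketch (mine, not from any of the linked sources) of one level of the Haar transform, the simplest wavelet: each pair of samples becomes an average (the coarse shape) and a difference (the detail), and small differences can be zeroed out for compression:

```cpp
// One level of the Haar wavelet transform: the simplest example of the
// "add and subtract" style of analysis described above.
#include <cstdio>
#include <vector>

// Splits the signal into averages (coarse trend) and differences (detail).
void haarLevel(const std::vector<double> &in,
               std::vector<double> &avg, std::vector<double> &diff) {
    avg.clear(); diff.clear();
    for (size_t i = 0; i + 1 < in.size(); i += 2) {
        avg.push_back((in[i] + in[i + 1]) / 2.0);   // local average
        diff.push_back((in[i] - in[i + 1]) / 2.0);  // local detail
    }
}

int main() {
    std::vector<double> signal = {4, 6, 10, 12, 8, 6, 5, 5};
    std::vector<double> avg, diff;
    haarLevel(signal, avg, diff);
    // avg  = {5, 11, 7, 5}   -- a half-resolution sketch of the signal
    // diff = {-1, -1, 1, 0}  -- small numbers; zeroing the tiny ones compresses
    for (double a : avg)  std::printf("%g ", a);
    std::printf("| ");
    for (double d : diff) std::printf("%g ", d);
    std::printf("\n");
    return 0;
}
```

Recursing on the averages gives the multi-level transform; notice there is no trigonometry and no matrix algebra anywhere, which is where the speed claim comes from.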
Maybe what needs to be done is to figure out what method, transform, or operation would be most advantageous for us to use. What I'm trying to do is state what the problem "is" and how to go about solving it, not that I necessarily know the answer. And there's always the chance that I have just fomented a huge case of the Dunning-Kruger effect and have no idea what I'm talking about. If so, please inform me and try to explain in a way such that my little pea brain can understand what might be a better solution.
>>20963 >And there's always the case that I have just formented a huge case of the Dunning-Kruger effect and have no idea what I'm talking about. Lol. There's a somewhat-disreputable-but-probably-reasonably-effective adage that goes 'Fake it till you make it'. Maybe you're on the right track here, but are yet-unexperienced-fully in using the language to spell it out clearly for us plebeians? Regardless, you certainly have the correct sentiments about efficient processing being a (very) high priority for the proper design of good robowaifus. Drive on! :^)
I found something I think could be really fruitful: geometric algebra. I was reading a totally unrelated blog and ran across this comment:
"...Geometric Algebra. It’s fantastic, it’s true, it’s representative of how the universe works, it’s relatively (hah!) simple, it’s predictive, and it’s almost completely ignored in scientific education. The behavior of both complex numbers and quaternions emerges from GA. Quantum spinors emerge from GA. Maxwell’s 4 equations become a single GA equation describing the relationship between electric charge and magnetism. And all this derives from a single, simple, unifying principle..."
What this appears to be, to my limited understanding, is a fairly easy way to do complex calculations on vectors and many other problems, including those of many dimensions. It's been a long time since I studied math, but I remember taking a class on complex numbers and how you could change them to vectors, and in consequence multiplying, adding, and otherwise manipulating them became very easy. I think this is much the same. You place what you are computing into this vector format and then it becomes fast, needing little computing power to manipulate. The power of this impressed me, as you can take Maxwell's electromagnetic quaternion math (don't ask) and reduce it to a more easily manipulated vector form for calculations.

Anyway, here's a book on it: Eduardo Bayro-Corrochano, "Geometric Computing: for Wavelet Transforms, Robot Vision, Learning, Control and Action". And notice it says "wavelets". I had an intuition that wavelets would be helpful to us. Maybe they are. https://en.wikipedia.org/wiki/Geometric_algebra

You can go to http://libgen.rs/ and type in "Geometric Algebra" with the nonfiction/sci button selected to find many more books on this. I tried to upload the book I mentioned and it stops at 71%. It's maybe too big. So go to the address I entered above and enter the title I mentioned and you should be able to find the book. It's where I got it from. This address is a great way to find books and scientific articles.
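To make the "complex numbers emerge from GA" claim concrete, here is a tiny sketch of my own (not from the book): in 2D, the geometric product of two vectors splits into a scalar (dot) part plus a bivector (wedge) part, and those scalar+bivector pairs multiply exactly like complex numbers:

```cpp
// Sketch: the geometric product of two 2D vectors a, b in GA.
// ab = a.b + a^b : a scalar (dot) plus a coefficient on the bivector e12.
// Since e12*e12 = -1, (scalar, bivector) pairs behave like complex numbers.
#include <cstdio>

struct Vec2   { double x, y; };   // a1*e1 + a2*e2
struct Rotor2 { double s, b; };   // s + b*e12 (the "even" part, complex-like)

Rotor2 geometricProduct(Vec2 a, Vec2 b) {
    return { a.x * b.x + a.y * b.y,      // dot: the symmetric part
             a.x * b.y - a.y * b.x };    // wedge: the antisymmetric part
}

// Multiplying two rotors: identical to complex multiplication, as e12^2 = -1.
Rotor2 mul(Rotor2 p, Rotor2 q) {
    return { p.s * q.s - p.b * q.b,
             p.s * q.b + p.b * q.s };
}

int main() {
    Vec2 e1{1, 0}, e2{0, 1};
    Rotor2 i  = geometricProduct(e1, e2);  // e1 e2 = e12, the unit bivector
    Rotor2 i2 = mul(i, i);                 // e12 * e12
    std::printf("e12^2 = %g + %g e12\n", i2.s, i2.b);  // prints -1 + 0 e12
    return 0;
}
```

The imaginary unit isn't postulated here; it falls out of multiplying two ordinary perpendicular vectors, which is the "single, simple, unifying principle" the quoted comment is gushing about.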
>>22452 This is pretty fascinating Grommet, I think you might be onto something. The point about Maxwell's equations is spot-on. They are in fact a kludge-job. (Please don't misunderstand me, James Clerk Maxwell was a brilliant, impeccable man. A true genius. He simply started from the premise of the reality of 'The Ether', which doesn't exist.) Everything they attempt to describe can be done much more simply and elegantly today. Therefore, since it's correct on that major point, this stuff is probably right about the other things as well. Thanks Anon! Cheers. :^)
>'The Ether', which doesn't exist
BLASPHEMY! BLASPHEMY! I must immediately remove myself to the cleansing force of the Inertia Field. :)
Did you know that this experiment DID NOT find that the speed of light is the same in the direction of Earth's orbit as compared to perpendicular to it? It did not. https://en.wikipedia.org/wiki/Michelson%E2%80%93Morley_experiment The textbooks say it did, but it did not. I have read, in the university library, an original copy of the experiment write-up from the men themselves. In the back it gives the differences. And many, many, many other experiments gave the same results. The most recent one gave a null result, BUT they did it in an underground mine. Cheaters. Maybe they know more about this than they let on. Rid yourself of this silly pseudoscience that there's no ether.
>>22613 From Newton and Math to Ether. What next? Flat Earth? https://youtu.be/JUjZwf9T-cs
>>22662
I don't want to get into too much detail. I can, and maybe I will in the off-topic thread (it would take some digging through a lot of sources I believe I still have), but you should not equate what I said to flat earth. There were HUNDREDS of experiments, with increasingly accurate equipment, testing the Michelson-Morley experiment, and none of them gave a null, equal result for the speed of light parallel and perpendicular to the Earth's movement in space. So I can't prove there's an ether, but I can prove that the test they SAY proves there's no ether, and their interpretation of the results they SAY they got, is incorrect. The textbook explanation of this is wrong.
>>22613
Haha, thanks Grommet. You are henceforth teh /robowaifu/ official court Natural Philosopher. The great Michael Faraday is the chief of your clan. :^)
Found a (new to me) book that covers geometric algebra:
Geometric Algebra Applications Vol. II: Robot Modelling and Control - Eduardo Bayro-Corrochano (2020)
The blurb on this thing sounds like a multicandied wonderland. I'll have to slog through it, not that I would understand it all. Some high points:
"...This book presents a unified mathematical treatment of diverse problems in the general domain of robotics and associated fields using Clifford or geometric algebra. By addressing a wide spectrum of problems in a common language, it offers both fresh insights and new solutions that are useful to scientists and engineers working in areas related with robotics. It introduces non-specialists to Clifford and geometric algebra..."
Unified domain. YEAH. Learn one thing and do it over and over!
"...Lie algebra, spinors and versors and the algebra of incidence using the universal geometric algebra generated by reciprocal null cones..."
"Incidence", "null cones": doesn't that sound a whole lot like that crazy thing I postulated, using a set point on a bot body and then specifying offsets to move limbs? >>22111 Sounds like it to me (maybe). So maybe here's a way to get the math to work.
"...Featuring a detailed study of kinematics, differential kinematics and dynamics using geometric algebra, the book also develops Euler Lagrange and Hamiltonians equations for dynamics using conformal geometric algebra, and the recursive Newton-Euler using screw theory in the motor algebra framework. Further, it comprehensively explores robot modeling and nonlinear controllers, and discusses several applications in computer vision, graphics, neurocomputing, quantum computing, robotics and control engineering using the geometric algebra framework..."
WOW. And he even has a section to make Chobitsu giddy with joy: "...and an entire section focusing on how to write the subroutines in C++... to carry out efficient geometric computations in the geometric algebra framework. Lastly, it shows how program code can be optimized for real-time computations..."
I'll try to upload it, but here's a link if not: http://library.lol/main/7C2C1AEAA23194B1D55E218BE5EE87E7
It won't upload, so you'll need the link. It's 20.6 MB.
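For a taste of the machinery such a book builds on, here is a sketch of my own (not code from the book): in 3D GA the even subalgebra (a scalar plus three bivectors) is exactly the quaternions, and a rotor rotates a vector through the sandwich product v' = R v R~. Sign conventions vary between texts; this follows the common quaternion one:

```cpp
// Sketch: rotating a vector with a GA rotor. In 3D, rotors (scalar + bivector)
// are exactly unit quaternions, so we can borrow quaternion arithmetic.
#include <cmath>
#include <cstdio>

struct Rotor { double w, x, y, z; };  // w + (x,y,z) bivector coefficients

Rotor mul(Rotor a, Rotor b) {         // quaternion (= rotor) product
    return { a.w*b.w - a.x*b.x - a.y*b.y - a.z*b.z,
             a.w*b.x + a.x*b.w + a.y*b.z - a.z*b.y,
             a.w*b.y - a.x*b.z + a.y*b.w + a.z*b.x,
             a.w*b.z + a.x*b.y - a.y*b.x + a.z*b.w };
}

// Sandwich product v' = R v R~ (R~ is the reverse, a sign flip on the bivector).
void rotate(Rotor r, double v[3]) {
    Rotor p{0, v[0], v[1], v[2]};        // embed the vector
    Rotor rc{r.w, -r.x, -r.y, -r.z};     // reverse of the rotor
    Rotor out = mul(mul(r, p), rc);
    v[0] = out.x; v[1] = out.y; v[2] = out.z;
}

int main() {
    double theta = M_PI / 2;             // 90-degree turn about the z axis
    Rotor r{std::cos(theta / 2), 0, 0, std::sin(theta / 2)};
    double v[3] = {1, 0, 0};
    rotate(r, v);                        // x axis -> y axis
    std::printf("(%.2f, %.2f, %.2f)\n", v[0], v[1], v[2]);  // (0.00, 1.00, 0.00)
    return 0;
}
```

The same sandwich pattern generalizes: in conformal GA (which the book uses) the identical R v R~ form handles translations and rigid-body screws too, which is why one set of subroutines can cover so much of kinematics.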
>>23088
>Geometric Algebra Applications Vol. II: Robot Modelling and Control
Neat! Nice title.
>And he even has a section to make Chobitsu giddy with joy,
LOL. Thanks Anon, I'm giddy. :^)
(Actually, it's everyone here that will be 'giddy' in the end, IMHO. C++ is our only practical option that will actually work... but I needlessly repeat myself :^)
>Lastly, it shows how program code can be optimized for real-time computations...
Sounds like an amazing book if he delivers. Thanks Grommet! Cheers. :^)
>>23088
If anyone here speaks Spaniard, maybe you can help us track down the software. The book's link given is http: //www.gdl.cinvestav.mx/edb/GAprogramming
So, AFAICT it looks like the 'edb' account is no longer a part of this Mexican research institution (at least on the redirected domain for this link). A cursory search isn't turning up anything else for me. Anyone else here care to give it a try? TIA.
>===
-prose, sp edit
-break hotlink
Edited last time by Chobitsu on 06/12/2023 (Mon) 21:44:14.
>>23095 That link is dead for me
>>23097
>That link is dead for me
Yup, thus my request. This link (the new, redirected domain by the Mexican government) is dead: https ://unidad.gdl.cinvestav.mx/edb
Even though his CV https://www.ais.uni-bonn.de/BayroShortCVSept2021.pdf (interestingly, located at a German research group) still lists this as his page. Whatever his other (impressive) mathematical accomplishments, he sure makes it hard to find his book's software, heh. :^)
---
Also, AFAICT the official book publisher's (Springer-Verlag) page doesn't have any software links either. Am I just missing something, anons? https: //link.springer.com/book/10.1007/978-3-030-34978-3
---
Here's an entry for his work at the MX institution. Appears to be a grant amount. Again, Spaniard may help out here. https: //www.gob.mx/cms/uploads/attachment/file/458453/CB2017-2018_2ListaComplementaria_Abril2019.pdf
p5: A1‐S‐10412 Percepción Aprendizaje y Control de Robot Humanoides (Perception, Learning and Control of Humanoid Robots), Centro de Investigación y de Estudios Avanzados del Instituto Politécnico Nacional, EDUARDO JOSE BAYRO CORROCHANO, INVESTIGADOR, CONTINUACIÓN, $1,974,315.95
>===
-add publisher's link
-minor edit
-add grants link
-break hotlinks
Edited last time by Chobitsu on 06/12/2023 (Mon) 21:43:35.
I went to Google, in desperation, as a last resort, and used their translation. He has a site at the school with his publications listed but...no links to the code. I tried searching for the book + code and all sorts of variations. I'm usually reasonably good at finding things, but...a big blank on this code. It's also not on the Internet Archive. There's a possibility that his code, though not exactly conforming to the book, is in his papers, as his book seems to be a summation of his papers. You can find his papers here: http://libgen.rs/scimag/?q=Eduardo+Bayro-Corrochano
So whatever code you are looking for, match the subject with the paper and maybe the code will be in the paper. Or at the least a mathematical representation of what the code is supposed to do.
More searching and I find a page full of software for Geometric Algebra, not his unfortunately but lots. Even in C++. https://ga-explorer.netlify.app/index.php/ga-software/
And look at the publications page for this site. It's all about integrating GA with computing and how to go about it. Interesting blurbs:
"...Geometric Algebra (GA) in diverse fields of science and engineering. Consequently, we need better software implementations...For large-scale complex applications having many integrating parts, such as Big Data and Geographical Information Systems, we should expect the need for integrating several GAs to solve a given problem. Even within the context of a single GA space, we often need several interdependent systems of coordinates to efficiently model and solve the problem at hand. Future GA software implementations must take such important issues into account in order to scale, extend, and integrate with existing software systems, in addition to developing new ones, based on the powerful language of GA. This work attempts to provide GA software developers with a self-contained description of an extended framework for performing linear operations on GA multivectors within systems of interdependent coordinate frames of arbitrary metric. The work explains the mathematics and algorithms behind this extended framework and discusses some of its implementation schemes and use cases..."
Another paper:
"...Designing software systems for Geometric Computing applications can be a challenging task. Software engineers typically use software abstractions to hide and manage the high complexity of such systems. Without the presence of a unifying algebraic system to describe geometric models, the use of software abstractions alone can result in many design and maintenance problems. Geometric Algebra (GA) can be a universal abstract algebraic language for software engineering geometric computing applications. Few sources, however, provide enough information about GA-based software implementations targeting the software engineering community. In particular, successfully introducing GA to software engineers requires quite different approaches from introducing GA to mathematicians or physicists. This article provides a high-level introduction to the abstract concepts and algebraic representations behind the elegant GA mathematical structure. ..."
https://ga-explorer.netlify.app/index.php/publications/

I'm getting the feeling that using this GA framework you can repeat the same machinery over and over, saving computing resources and folding all your computing into one big scheme that can be reused with far fewer resources. Now this is VERY MUCH like that Rebol programming language that I blathered so much about. One of its BIG strengths is the unifying character of "series lists" and the manipulation of them. It's why Rebol can pack all these different functions into the software package and still be a megabyte. I see this sort of thing all over the place, like the Plan 9 and QNX operating systems. They use to great effect the idea of making everything in the code pass messages instead of a mish-mash of pointers and other such drivel. A counter-example to show the difference: Linux is old-school mish-mash, so it's a huge hairball of mass and dreckage, while QNX and Plan 9 are light, tidy things. The L4 microkernel family does this too. In fact it was a dog speed-wise until they changed it to message passing; then it took off. I think they use a version of this in F-16s as the OS. I want to emphasize I'm not a math wiz, or even a fizzle, but I'm OK at recognizing patterns.
Now I also know next to nothing about AI, but I do know it's a huge mass of matrix manipulations. And it's very likely, as with Maxwell's quaternion calculations, that GA can whittle it down to size. It may be that the same sort of resource compaction can be done for AI with GA also. Or maybe not. One more link: https://hackaday.com/2020/10/06/getting-started-with-geometric-algebra-for-robotics-computer-vision-and-more/
>>23145 There's a library for that called opencv. You can do it from scratch if you want though.
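For reference, a minimal face-detection sketch in C++ using OpenCV's classic Haar-cascade detector. This is the standard OpenCV API; the image filename and cascade path are placeholders you'd adjust for your install:

```cpp
// Minimal face detection with OpenCV's classic Haar cascade.
// The cascade XML ships with OpenCV; adjust the path to your install.
#include <opencv2/objdetect.hpp>
#include <opencv2/imgcodecs.hpp>
#include <opencv2/imgproc.hpp>
#include <cstdio>
#include <vector>

int main() {
    cv::CascadeClassifier face;
    if (!face.load("haarcascade_frontalface_default.xml")) return 1;

    cv::Mat img = cv::imread("test.jpg");   // any image containing a face
    if (img.empty()) return 1;
    cv::Mat gray;
    cv::cvtColor(img, gray, cv::COLOR_BGR2GRAY);
    cv::equalizeHist(gray, gray);           // normalize lighting a bit

    std::vector<cv::Rect> faces;
    face.detectMultiScale(gray, faces);     // returns face bounding boxes
    std::printf("found %zu face(s)\n", faces.size());
    return 0;
}
```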
>>23143 >>23144 Thanks Grommet! We'll keep looking from time to time. :^) >>23147 Thanks for the info Anon. OpenCV is pretty amazing IMO.
