/robowaifu/ - DIY Robot Wives

Advancing robotics to a point where anime catgrill meidos in tiny miniskirts are a reality.


F = ma Robowaifu Technician 12/13/2020 (Sun) 04:24:19 No.7777
Alright mathematicians/physicists report in. We plebeians need your honest help to create robowaifus, explained in beginner's terms. How do we make our robowaifus properly dance with us at the Royal Ball? >tl;dr Surely in the end it will be the laws of physics, and not mere hyperbole, that bring us all real robowaifus. Moar maths kthx.
Adding further to what is needed: you are going to need an operating system. I'm willing to bet that the easiest way to do this is to take a micro-kernel type system like L4. There are lots of these, but they all tend to be real-time and have been used in billions of devices. I saw it mentioned that an L4 derivative was used to control F-16 fighters. Here's a link on this OS: https://en.wikipedia.org/wiki/Open_Kernel_Labs It would also seem to me that it would need some sort of message-passing system to send commands to different parts or functions. This is like the Plan 9 OS or QNX, which have been proven to work very well. QNX has shown that message passing doesn't have to cost a micro-kernel-based system a great deal of speed. Taking something already used and open source will speed things up. Other links: https://web.archive.org/web/20191024093602/http://l4hq.org/projects/os/ One BIG problem is that they seem to be trying to bury all the code from these micro-kernel L4 projects. Typical for certain operatives to hide useful things. You can look at the links above, but you have to go way back in the archives to get them. The sites have been taken over. Here's one where I went far back to get to the site: https://web.archive.org/web/20010405192923/http://www.cse.unsw.edu.au/~disy/L4/index.html
>>20473 >>20481 wow, thanks for that long explanation. Guess I've got my work cut out for me. And it looks like I'll need to do a masters at least, if not a PhD. I was hoping to finish up my undergrad and then devote most of my time to making robots, but it looks much harder than I thought. I should probably download all those L4 project codes right now before they get erased.
>>20473 Nice paper, Anon. No doubt using single bits is much cheaper than using arrays of them. I'd suggest that this style of approach to the data will also lend itself rather handily to future implementations directly in 'hard-wired' hardware (other than just FPGAs). This ofc would be very inexpensive as well as blazingly fast. >"...various works have explored how to improve their accuracy and how to implement them in low power and resource constrained platforms." I think it's neat that this is a large-scale need, and not one unique to our robowaifus. Thanks Grommet! That was a rather well-written paper that didn't unnecessarily descend into the industry jargon commonplace in many of these researchers' writings. These authors instead seemed to be actually trying to help the relative uninitiate understand.
>>20481 >Here's one where I went far back to get to the site. Thanks for the studious efforts Anon.
>>20488 haven't read the actual paper itself, only the links. It's not too hard for a noob, right? I can't understand most of the stuff written in these papers, so I could never read them.
Maybe I'm wrong about the bury part. This would seem to be a very good target to work on: https://sel4.systems/ Look at this page, which goes over the various L4 OS implementations: https://en.wikipedia.org//wiki/L4_microkernel_family This one has been formally verified as not having any security problems, deadlocks, or glitches. So if this could be the base, then it would likely not have any surprises in it.
>>20491 I've been checking out the wiki and it says L4 is closed source. There's a variant, OKL4, which is open source. Is that inferior to the original L4? And it seems to be compatible with MIPS, ARM and x86, but no word on RISC-V.
>>20484 >And it looks like I'll need to do a masters at least, if not a PhD I'm really sorry about that. I know it's a big pain in the ass, but looking at this stuff it appears to me that it will be necessary to actually have something that will walk around, recognize you and maybe understand commands. It's a real pain to look at all this, but I don't know any shortcuts. It's depressing. I'm fully capable of being entirely wrong. Nothing would make me happier than to be. But it seems that if you want something that can properly expand over time, you need an OS, or what will happen is you will have a big binary pile of stuff that will never, ever be ported to anything but one specific target, and then bit rot will immediately set in and it will become useless. If however an OS is used, and some of the L4 OSes are written in C and portable to many processors, then once you get the base working you can port it to other processors, speeding up the waifu and adding capabilities at the speed of processor advancement. I'm willing to bet that if you cobble something together that runs a waifu, by the time you get it all going you will have written an OS's worth of stuff to make it work that you could have gotten off the net in the first place. Better to start with something they have already worked on. They are using L4 for missile systems, jet fighters, medical equipment. It's first-class stuff. The bad part is you have to learn to use it and, of course, none of this is easy. Hence my constant hemming and hawing and pleading that I'm not sure I understand this or that, because it's hard.
>>20497 The code part isn't the hard part. I can learn it myself, but robotics is a practical field. Sooner or later I'll have to get my hands dirty, and that's best done under the supervision of someone who knows what they're doing.
seL4 has its code online. OKL4 was bought out by General Dynamics. They are using it in missiles and such. If I was forced to pick one, I would say use seL4. They (DARPA, I think) went through every line to make sure that the code would not give anyone any surprises and that it would work. Here's the link for the code: https://github.com/seL4/seL4 I repeat myself, but the original code was L3; then L4 made a big breakthrough. The guy who did it made the micro-kernel's message passing work MUCH faster. Like 20 times faster. So work exploded. Here's a link on the evolution of the various codebases built on his original L4 work: https://en.wikipedia.org//wiki/L4_microkernel_family The advantages of micro-kernels are VERY high in systems we do not want to fail. Because the basic system that runs everything is small and cannot be screwed with easily, it stays solid and works. Other programs may fail and not take the system down. Very important. Until that advancement in message passing, micro-kernels were very slow. Jochen Liedtke made the breakthrough that made speed a non-problem, and others improved on it.
From this page https://docs.sel4.systems/projects/sel4/frequently-asked-questions "...To the best of our knowledge, seL4 is the world’s fastest microkernel on the supported processors...64-bit RISC-V kernel is about 9,400 SLOC...Presently seL4 runs on Arm v6, v7 (32-bit) and v8 (64-bit) cores, on PC99 (x86) cores (32- and 64-bit mode), and RISC-V RV64 (64-bit) cores..." NICE! I wonder if it will run on ESP32 micro-controllers. If so, it would be SO IDEAL. Even if it doesn't, you could use serial comms to talk to them.
This is great stuff Grommet. Thanks for all the research. If you feel strongly about this as an OS platform, then I'd think it's in our collective best interests here to investigate it thoroughly in a practical way?
>>20481 >One BIG problem is that they seem to be trying to bury all the code from these micro-kernel L4 projects OK, I'm wrong about this. I looked at all this L4 stuff several years ago, and when I looked recently a bunch of it seemed to be gone, but seL4 is current. The guy who came up with the code that made L4 possible died. (Likely another one of those "died suddenly" cases we see so much of.) It may very well be that a lot of his projects died with him and the sites went down. I've looked at the L4 stuff for a long time, just not recently. I want to add: if you are interested in robowaifus, you're going to need micro-controllers for input and output. There's one called the ESP32 that is really the Swiss Army knife of micro-controllers. Here are some comments I made on them: >>12474 >>18902 >>18778 >>12480 Here I do some math on the cost to build with these micro-controllers: >>13408 BTW here's a paid OS for ESP32 microcontrollers: https://mongoose-os.com/mos.html
>>20507 I'll check them out. But I'd rather some other anon specialize in microcontrollers and OSes. I'm already deep into the AI part, and I'm not sure I have enough time to spare to learn something completely new. If we are to make proper robowaifus, we need different specializations working together, instead of everyone becoming a jack of all trades, master of none. btw, we'll be solely using RISC-V in our robowaifu microcontrollers, right? I wouldn't trust closed-source ARM and x86, and iirc the MIPS creators have moved on to support RISC-V.
>>20506 >investigate it thoroughly in a practical way? Part of this is that I read a lot of stuff, because this sort of thing interests me. It may very well be that there are big problems that are not readily apparent on the surface. I'm trying to point people to stuff I've seen that "seems" to work, but there's no doubt I could be missing a lot of alternatives that could be better. This seL4 looks really good, though. It has definitely been used for major systems like missiles and planes, and the source is available. That being said, none of this stuff is really easy. The Boston Dynamics people worked on this stuff for many years. Fake dog and fake oxen. I suspect that these guys, BD, coded everything into a big wad of code with very specific motion routines. I can't say I know for sure, but I "think" that if we were to make some basic movement code, say a rough outline of movement, and then run AI on top so that it learns to walk, it would be faster and less computationally dense. A lot faster and cheaper. People here have said that has been tried and it didn't work. Maybe if it could watch itself, and then reference an actual human walking to correct itself as it learned???? Not easy. If you can build a waifu, you could also build an exoskeleton, and that could be used to program the waifu.
>>20510 does BD ever plan to implement some kind of AI or other adaptability code in their robots? Otherwise, the Spot robots they plan on selling to the police will not take off. Nor will their other robots (I forget the name). Scripted obstacle courses and dances can only take you so far. I'm so frustrated with BD; they're the only real competition to Tesla's Optimus. Unless Honda brings out an ASIMO 2.
Here's an ESP32 being used for face recognition. I don't know if the code is AI or not. https://randomnerdtutorials.com/esp32-cam-video-streaming-face-recognition-arduino-ide/ Enough, I'll stop filling up comments now.
>>20512 >BD I have no real knowledge of what they are doing, but look at their stuff. It looks like they programmed in all this motion stuff with physics and all of that. Or so it seems to me. I think that path is a dead end. But what do I know, I'm just some guy on the internet.
>>20513 >Enough, I'll stop filling up comments now. No, don't stop lol. Everything you've ever posted here has been useful information Grommet, some very-much so. :^)
>>20514 Exactly what I said. They've already hit a dead end imo. How many different real-world scenarios can they hardcode into their robots? They'd really better start investing in AI. Hopefully they'll poach some talent from the Tesla guys working on Optimus. While the Optimus robot generally felt like a sore disappointment, I thought the AI and vision part was pretty good. I'd like to see it at work in BD robots.
>>20512 >I'm so frustrated with BD I mean, I think I understand your position, Anon. But frankly, I see any fumbles by the big players as feeding into our own success here on /robowaifu/ and our related cadres out there. More 'breathing room', as it were.
>>20525 I don't particularly care who gets to working humanoid robots first. I see all advances as a win. Besides, even their wins would eventually trickle down to DIY anons. I'd definitely buy a BD robot, take it apart to see how they made it, then make my own.
>>20524 >While the Optimus robot generally felt like a sore disappointment I predict Tesla will soon shoot far past every other player in this arena, anywhere in the midterm timeframe. We can be thankful they aren't targeting the robowaifu market! :^) >I thought the AI and vision part was pretty good. It is interesting. But not too surprising IMO; after all, it's literally the same AI board they use in their self-driving cars. >=== -minor edit
Edited last time by Chobitsu on 02/21/2023 (Tue) 10:55:09.
>>20526 Ah, I see. Well, I understand the perspective at least. However, I personally feel there are moral stakes impinging on this entire domain that make it vital that we get robowaifus out there as a real cottage industry well before the big players invade that niche and manipulate the system to make anything but their own globohomo-approved mechanical c*nts illegal. History shows us, time and time again, their slimeball tactics in this manner. >=== -prose edit
Edited last time by Chobitsu on 02/21/2023 (Tue) 10:54:01.
>>20527 Honestly, as much as I hate Elon, I think if anyone's going to start selling robowaifus and disregard the mainstream outrage, it's Musk. >>20528 Thing is, it's much easier for the world's governments to start banning robowaifus while it's just a small DIY scene. But when you've got megacorps who would lobby billions, it's much harder to ban. The more robowaifus proliferate, both in the DIY scene and in large corporations, the harder it will be to ban them. And I don't think any megacorp will actually advocate banning robowaifus. Their only ideology is their bottom line, and robowaifus would potentially be a trillion-dollar industry.
>>20529 >I think if anyone's going to start selling robowaifus and disregard the mainstream outrage, it's Musk. I too see him creating new companion lines once we here succeed at it first. He clearly is targeting his own factories first and foremost, and then specialized labor uses. Then he'll likely move into medical/patient care. During all that time, we'll be working on perfecting robowaifus, ofc! :^) >>20529 I disagree on both points. Since we are open-saucing everything here, it's a genie out of the bottle. Since the East will go banging with this quickly, the Globohomo will never be able to stop it. Secondly, I feel you err in your estimate that somehow the 'megacorps' and the government are two distinct entities. They haven't been for a long time. That's why their full, formal title is The Globohomo Big-Technology/Government. And it's also the odd state of affairs that the tech-tail is wagging the beltway-dog, very clearly. >=== -minor edit
Edited last time by Chobitsu on 02/21/2023 (Tue) 11:08:18.
>>20530 I do think they're working in parallel on adapting Optimus to waifu duties, and they'll release it a few years after the factory robots. And I do not believe the Globohomo to be one huge monolith with a single goal. There are different factions with competing interests within it, hence why you often find governments and companies coming into conflict. Among them, I believe companies to be more shortsighted and focused on profits than on some world-government/global-control scheme. They're also the ones who fill the pockets of politicians. They can definitely see the potential profit in offering a robowaifu, especially in this day and age, with billions of lonely men.
>>20531 >I do think they're working in parallel on adapting Optimus to waifu duties Nice. But they'll have to create something on an entirely different frame geometry (which will require re-optimizing all the articulation & actuation code). Fair enough about your estimates on the Globohomo. I might have the time & energy to debate this topic later. Speaking of my energy: we're well off-topic ITT. Any further discussion, please move it to somewhere else like /meta or news. It's a tedious, error-prone process copy-pasting each post, one by one, over into its proper thread with a new post each, and then deleting all the originals; but sadly that's exactly what I have to do each time I 'move' a conversation to another thread. I'd like to 'cut the saying short', if you take my meaning! :^) >=== -prose edit
Edited last time by Chobitsu on 02/22/2023 (Wed) 05:04:28.
I should also mention that the micro-controllers will need an OS to lessen the burden of writing all the super-complicated timing stuff ourselves. Really hairy stuff. A good open-source one that works on a huge number of micro-controllers, including my favorite the ESP32, is FreeRTOS. It's real-time, so it will be responsive. It's also well positioned on the legal/certification side: how long before they add in safety rules and regs? Not long. It's MIT-licensed, so you're free to do as you please, but if you pay there are verified guarantees. There seems to be a good deal of documentation for it, plus other libraries to use, and code for RISC-V, ARM and others. This means we can write code and use whatever micro-controller we can get at the least cost and highest performance. https://www.freertos.org/index.html I don't think this will work for full microcomputer processors, but I'm not sure. It'd be nice if it did; then we could use more powerful RISC-V chips for processing speech, vision, etc. while using the same OS everywhere. Less to learn. I'm not saying I know how to do this, but my thinking is: use all these micro-controllers for input/output, mostly to walk and move around, but ALSO use the fairly large computing power they have built in to do processing as well. So say the robowaifu wants to clean something or do something complicated: it could mostly stop moving and use a little of its micro-controllers' distributed computing power to aid whatever task it was concentrating on. Much like humans; when they concentrate, they slow down and think. As I said before, I think something like seL4 will likely have to be used on the main processor for speech, understanding and navigation. There's something I talked about earlier that I think is really important. In order to keep cost low we will have to use some sort of off-the-shelf micro-controller. The ESP32 I like so much has a large number of inputs (they can read capacitive sensors for touch) and a large number of outputs. Ideal.
Now, instead of building some contrived output board, we use these built-in outputs (driving transistors, or more likely MOSFETs) AND, the big win, these things have a lot of computing power. If we can share it between the various micro-controllers, it may very well be that for basic moving around and not bumping into things we could just use the built-in computing power and not have any main processor at all. Later, for speech or higher functions, we could add a fast RISC-V microprocessor and link it to the other controllers. What kind of power are we talking about? I wrote before, "...You can get them for less than $9USD... So at 300 muscles/16 PWM output channels per micro-controller, means we need 19 and at $9 each=$171 But with that comes 19 x 600 =11,400 DMIPS. DMIPS is basically that many integer million instructions per second. It's a lot. 11.4 billion total per second with 19 MC's. >>12474 Let's say we check every output and input every micro second so a 1000 times a second and it takes 10 instruction cycles to do so; that leaves us 599,560,000 instructions a second to do...something with. And that's just one processor. Most things we are going to do are compares and tests: is the arm here or there, has the touch sensor touched anything, etc. Most of these are values you compare to some other value. I don't think the average computational load will be very large. Even if it's ten or a hundred times larger, we still have a hell of a lot of computing left over. I think with the right algorithms, walking and moving about will take very little computing power. After all, insects move about just fine and they have next to no computing power. Of course figuring out how to do this might be tricky. I bet someone, somewhere, has a paper about this, but I'm not sure I've seen it. Looking up insect-movement papers might be helpful. "...Insect brains start at about 1000 neurons..."
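The budget arithmetic in that quoted bit can be double-checked with a few lines of C. This is only a sketch: the 300-muscle count, 16 PWM channels per chip, $9 unit price and 600 DMIPS figures are the post's own estimates, not vendor specs.

```c
#include <stdio.h>

/* Back-of-envelope build plan from the estimates above. */
struct BuildPlan { int mcus; int cost_usd; int total_dmips; };

static struct BuildPlan plan(int muscles, int pwm_per_mcu,
                             int usd_per_mcu, int dmips_per_mcu) {
    struct BuildPlan p;
    /* round up: 300 actuators / 16 channels = 18.75 -> 19 controllers */
    p.mcus = (muscles + pwm_per_mcu - 1) / pwm_per_mcu;
    p.cost_usd = p.mcus * usd_per_mcu;
    p.total_dmips = p.mcus * dmips_per_mcu;
    return p;
}
```

Calling `plan(300, 16, 9, 600)` reproduces the numbers in the post: 19 controllers, $171, and 11,400 DMIPS of aggregate integer throughput.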
It's a lot of power these MCs have, and communication is built in with the CAN 2.0 bus, like they use in cars, industrial machinery and medical equipment. Now, none of this is easy: learning an operating system, writing code for micro-controllers, using CAN bus commands. But think if you had to write this stuff yourself from scratch. Hopeless. Almost a lifetime job. But if you can crib through a manual and copy other people's snippets of code, cut and paste from specs that are used by lots and lots of people (meaning more code in the open), maybe you could get something working. If anyone has any ideas on how to send messages over the CAN 2.0 bus for movement of actuators, I would really like to hear them. You would have to send some sort of vector, meaning a direction, in or out, and a velocity, which would be turned into a voltage to drive the actuator. Now, you want to be able to change this in the middle of a movement, so how do you do that? Could you send a beginning vector and then maybe a second vector? How do you coordinate these vectors so you get walking instead of jerky, stuttering movement? Could one micro-controller control all the muscles in one limb, so that you could tell the limb to move, say, forward ten inches and two inches to the side, and have the controller figure out how to work the muscles together? Could you use some sort of AI software to do all this coordination? Coding it all by hand could take forever. What kind of AI code would you use for this sort of thing? Lots of questions, no answers...yet.
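As one possible answer to the message-format question: here's a sketch of a joint command squeezed into a single 8-byte CAN 2.0 data field. The field layout, names, and units are entirely made up for illustration; this is not any existing protocol.

```c
#include <stdint.h>

/* Hypothetical actuator command packed into one 8-byte CAN 2.0 frame.
 * Layout, units and joint numbering are illustrative only. */
typedef struct {
    uint8_t  joint_id;     /* which actuator (0-255) */
    int16_t  velocity;     /* signed: sign gives direction, e.g. mm/s x10 */
    int16_t  target_pos;   /* setpoint, e.g. mm x10 */
    uint16_t ramp_ms;      /* time to blend from the current motion */
    uint8_t  seq;          /* sequence number so stale frames can be dropped */
} JointCmd;

/* Serialize into the 8-byte CAN data field, little-endian. */
static void pack_cmd(const JointCmd *c, uint8_t out[8]) {
    out[0] = c->joint_id;
    out[1] = (uint8_t)(c->velocity & 0xFF);
    out[2] = (uint8_t)((c->velocity >> 8) & 0xFF);
    out[3] = (uint8_t)(c->target_pos & 0xFF);
    out[4] = (uint8_t)((c->target_pos >> 8) & 0xFF);
    out[5] = (uint8_t)(c->ramp_ms & 0xFF);
    out[6] = (uint8_t)((c->ramp_ms >> 8) & 0xFF);
    out[7] = c->seq;
}

/* Deserialize on the receiving limb controller. */
static JointCmd unpack_cmd(const uint8_t in[8]) {
    JointCmd c;
    c.joint_id   = in[0];
    c.velocity   = (int16_t)(in[1] | (in[2] << 8));
    c.target_pos = (int16_t)(in[3] | (in[4] << 8));
    c.ramp_ms    = (uint16_t)(in[5] | (in[6] << 8));
    c.seq        = in[7];
    return c;
}
```

With something like this, changing a motion mid-movement comes for free: send a new frame with a fresh velocity/setpoint and a ramp time, and the limb controller blends toward it, using `seq` to discard out-of-order frames. Again, a guess at a workable layout, not a known standard.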
>>20558 This is absolutely great stuff, Grommet. Thanks! >Let's say we check every output and input every micro second so a 1000 times a second I'm guessing you mean milli-second? >If anyone has any ideas on how to send messages over CAN2 bus for movement of actuators I would really like to hear it. There's RobowaifuDev's IPCNet (>>2418). Also, you can read this (>>772). My apologies I can't give the response ATM that your post deserves, Anon. Cheers. :^) >=== -minor edit
Edited last time by Chobitsu on 02/22/2023 (Wed) 04:58:59.
I hate to keep changing things, but I hadn't looked at real-time operating systems for micro-controllers in a long time, and things have really changed. I found a really interesting OS called Zephyr. https://en.wikipedia.org/wiki/Zephyr_(operating_system) This looks good. It is run by the Linux Foundation and is maintained. It will also run on larger, faster processors, so that's a big plus: you could run your MCs and your main high-power processors on the same OS. Big win; it saves you from having to learn more than one system. Huge advantage. Here's a comparison of some well-known OSes for micro-controllers: https://micro.ros.org/docs/concepts/rtos/comparison/ I'm still looking at these to see which might be best. It's a big deal to pick one, because of all the effort you will put into learning it. Once there, you're not likely to change, so having the right mix of ease of use and reusable code from others is important. While I was looking at the OS comparison link I went up to the main site and found there is such a thing as a "Robot Operating System", which rides on top of the lower-level OSes listed as a base. Holy smokes, there's a lot of books and data on this thing. It's also free. I'm asking all these questions about how to do coordination; maybe it's already been done??? https://micro.ros.org/ Here's a link to a search for a LOT of books on this Robot Operating System: http://libgen.rs/search.php?req=Robot+Operating+System&open=0&res=25&view=simple&phrase=1&column=def I have to look further and see what this is all about. My first impression is this is the shit. WOW! This is what's needed (if it works). There are tons of books and documents, and the books say you can simulate robots. How cool is that? Get the right actuator, then simulate all the movements and software before you build. You could make progress with zero money, just simulation. BUT does it really work?
They're very impressive marketing blurbs, but you know how these things are. I've got an idea for using robots for some dangerous work, stuff you could do yourself or with two guys, and maybe make some money. It wouldn't be a waifu, but building and operating the equipment would use all the same tools and provide, maybe, some cash. This really excites me. I may be buried in books for the next few months.
>1000 times a second I'm guessing you mean milli-second? Oops...yes
>Car Hacker's Handbook Much thanks. I need that thread. I'm looking at robots AND drones right now. I've got an idea that I can use drones to carry ropes to the tops of trees, attach cables, and crawl or pull up robot tree-cutters to trim or fell trees. There's some money in this, and my Mom has a tree that really needs its limbs cut. I've been debating how to trim it down. I bought a tree harness and have some ropes, but I'm terrified of climbing this really tall tree. And let's not even talk about using a chain saw while up in the tree. I know how to use one, but...it's like 75 feet high. The cost of getting someone to trim these is really high, and this seems the right time to try something different. It's going to fall on the house. So I'll get some tree-cutting bids, see just what it will take, and "try" to build some sort of tree-trimming robot. In the process I'll learn all the skills needed for waifus, save my Mom's house, and save her some bucks. My thinking is to get away from chain saws. Build something like huge garden shears, like the beak of a parrot, with inward-cutting blades. I've talked a lot about transmissions; I think I can build one of those I talked about. Slowly close the jaws and snip limbs right in two. If one's too big, nip at it a little at a time. Another good selling point: electric power means much less machine noise. The job is as dangerous as can be; I know someone who used to do it. A chainsaw will tear a huge chunk out of you in seconds if you let it hit you. I saw a guy hit his leg one time. DAMN, the blood. It went right into his leg. There's also the homeowner's fear of the guy killing himself and suing, so knowing the crew will be on the ground and using robots is, I think, a big selling point. That's my plan over the next many months. The plan now is to use this kick-ass-sounding Robot Operating System and the ESP32, and try to make it work.
I can weld and have a decent knowledge of aluminium casting (though I haven't done it), so I expect I can build whatever I need, and what I can't, like bearings, is easy to get. I have lots of reluctance-motor ideas. I'll just have to build some and see what works. A PLAN!
>>20497 >over time you need an OS or what will happen is you will have a big binary pile of stuff that will never ever, ever be ported but to one specific thing and then immediately bit rot will set in and it will become useless. I think bit rot can be mitigated by something like btrfs or zfs. Also, I maybe don't understand, but if something runs on a known system you can always emulate that. Anyways, thanks for the reminder that maybe we should use some RTOS for movements at least. Ideal would be to have code that can be used on a more common system and then transferred to such a system as soon as necessary. It's likely that people will implement things in Arduino C++ or Micropython these days. >but I'm terrified of climbing this really tall tree. I did this as a child for fun. You have four limbs, it's unlikely that all of them fail or slip at the same time. Also, the branches below you would catch you while falling. Well, being slender, fit and young would help. A belt on top of it should make it very safe. >chainsaw ... I saw a guy hit his leg one time. Yeah, please don't do that.
>>20575 It's a great business idea Anon. I hope you can successfully pull this off as a plan. Maybe we should consider starting a non-robowaifu projects prototyping thread here? Certainly this tree surgeon robot would be much simpler than a robowaifu will be, yet touch on many of the similarly-needed design & engineering concepts. >The job is dangerous as can be... Yeah, please do not do this yourself Grommet. You're too valuable to our group here! >>20582 >Ideal would be to have code that can be used on a more common system and then transferred to such a system as soon as necessary. Agreed. >=== -minor edit
Edited last time by Chobitsu on 02/22/2023 (Wed) 11:14:42.
>>20582 >I think bit rot can be mitigated by something like btrfs or zfs Maybe I'm using the wrong terminology. You misunderstand; I'm not trying to be rude, just more precise. It's going to take a while to get this stuff to work. Each micro-controller has its own glitches and exceptions. If you code for just one, then very soon the latest and greatest will come out and you will be covered up in weird bugs. "If" we can use these robot operating systems, the companies or software providers will show you which micro-controllers work with the software, and you can port to them easily. You're calling functions instead of poking raw registers and writing assembly. I have very little programming experience, only FORTRAN and hexadecimal assembly. Hex programming is hard and takes forever. I could do it, but the time required is immense. The assumption I'm making is that these libraries can do a lot of the little stuff while we concentrate on the larger movements of the robot. All the stuff related to timing and the whole pile of OS-type housekeeping, I don't think I can do that, at least not in this lifetime. I've been looking a little at the Robot OS and it's hard enough. The link I gave above is micro-ROS; there's a larger Robot Operating System with way more features. It runs on Linux. So we may need a single-board computer with Linux, then micro-controllers to operate the limbs, sensors etc., networked together like nerves in a human. Let's hope that micro-ROS, for micro-controllers, and the larger ROS are similar; I get the impression they are. I can already see this will be harder than originally thought. I think you will almost have to learn C to do this. I've been downloading books for it. I have made a couple of C programs and I really hate this stuff. Programming is something that, to me, is satisfying "after" you finish, but a real pain in the ass while you are doing it. As for climbing trees:
When I was young I climbed plenty but never with a chain saw running nor am I young any more.
>>20629 Your writing was a bit verbose before, and I got confused trying to follow the discussion here. I can't always parse a whole comment, or I lose the train of thought in the conversation. I think you actually made the case for using an OS, which I support, without claiming to know much about it, so long as there's no strong reason against it. Otherwise, someone has to make the case for the alternative: write an OS or drivers from scratch, down to addressing the SBC? Just to make it a bit more efficient? Maybe I'm wrong, but this looks like overkill, or at least premature optimization. There should be some abstraction layer running on many SBCs; then we'll only need to write the code on top of it. What's wrong with some L4 OS, again? Who's arguing against it? Do we even need it? Is this amount of fail-safety even necessary? The real world is messy; we should make our robots able to sense and adapt to it, not move super fast and ultra-precisely. >I think you will almost have to learn C to do this. Maybe something like Nim that compiles to C is good enough? I looked briefly into L4 and it seems to have its own programming language, or maybe it's just a coding style?
>>20877
>nesper
>I generally dislike programming C/C++ (despite C's elegance in the small). When you just want a hash table in C it's tedious and error prone. C++ is about 5 different languages and I have no idea how to use half of them anymore. Rust doesn't work on half of the boards I want to program. MicroPython? ... Nope - I need speed and efficiency.
Thanks, this might come in handy. But why are we in the math thread?
>>20882 >But why are we in the math thread? My apologies to everyone here for contributing to this abuse big-time. NoidoDev, would you be willing to consider becoming involved directly in taking on the tedious tasks (>>20532) involved in cleaning our threads up with posts in their proper places? (Again, apologies for further compounding the problem with this post.)
>>20884 >would you be willing to consider becoming involved ... in cleaning our threads up with posts in their proper places? How? Trying to register as vol?
>>20901 Yes. You just have to get Robi to turn on account creation, then once you have one made, I'll assign that account as a volunteer here on /robowaifu/. Trust me, this isn't fun work, so you should give some consideration if you really want to take it on first.
Just so I can say I was not "totally" off topic: a lot of control theory, I believe, is loosely, very loosely, based on the same sort of ideas as Fourier transforms. Sine waves. I'm not a mathematician, but it seems most of it is based on waveforms, and the math is slow because it generally uses sine waves. There's a set of math functions based on stretching and raising the peaks of waves, "wavelets", that is far, far faster. DjVu uses wavelets, a lot of oil prospecting seismic processing software uses wavelets to tease out very fine-grained underground structures from the data, and movies use these functions to compress data. I've read the processing time for signal processing can be 10 times less using wavelets to analyze data, statistics, etc. Sine-wave-based signal processing seems to use far more processing steps. More computing time. Wavelets use much more simple addition and subtraction, without a lot of matrix algebra.

I can't help but think it may be analogous to the mass of matrix multiplications that AI uses now compared to the way Xnor.Ai processes AI far faster. I'm trying to grasp the big-picture pattern here. It seems that present AI (which I'm going to equate to a waveform) uses a lot of matrix multiplications to go over every single speck of the data, analyzing each and every little data point. Xnor.Ai uses a far coarser examination, asking whether this one clump of data is larger than that clump of data, and then passing the result on as yes or no. They only care about the larger coefficients when analyzing it. I see this as comparable to wavelet processing in a general, big-picture way. I'm probably screwing this up, but I hope I've pointed things in a generally correct direction. https://en.wikipedia.org/wiki/Wavelet Another offshoot of this idea is a "chirplet".
There's a GREAT picture of the different waves at this link that gives you a big-picture idea of what I'm trying, probably unsuccessfully, to convey. I'll link the picture too. https://en.wikipedia.org/wiki/Chirplet_transform https://upload.wikimedia.org/wikipedia/commons/1/10/Wave-chirp-wavelet-chirplet.png

Look at how the different waves could be used to represent or analyze information. Here's my understanding of why this is a good thing. Look first at the "wave". If you had to add up a lot of these, like a Fourier transform does, it would take a lot of them to fit the signal we are approximating. I think the general idea is the same as successive approximation in calculus: we add up all these waves to make them fit our actual data. Now look at the wavelet. It can be stretched and have its peaks raised to fit, so I think this function needs fewer coefficients to fit the signal. Now look at the chirplet. Since it already has a bit of stretch built into the function, it might take even less stretching and raising of the amplitude to approximate the information waveform.

I think the basic idea is that you transform the signal of whatever we are trying to analyze into a convenient waveform (wavelet, chirplet, etc.); then we can use simple addition and subtraction to quickly analyze the data and tease out what is going on in this formerly complex wave of data. This vastly simplifies the processing power needed. Now, me saying this like it's some simple thing, well, it's not. Figuring out "what" transform to use and how to set it up is difficult. Maybe what needs to be done is to figure out what method, transform, or operation would be most advantageous for us to use. What I'm trying to do is state what the problem "is" and how to go about solving it, not that I necessarily know the answer. And there's always the case that I have just fomented a huge case of the Dunning-Kruger effect and have no idea what I'm talking about.
If so, please inform me that this is the case, and try to explain, in a way my little pea brain can understand, what a better solution might be.
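To make the "wavelets are mostly adds and subtracts" point concrete, here is one level of the Haar wavelet transform, the simplest wavelet there is. This is a minimal Python sketch (illustrative only, not a tuned DSP routine): it splits a signal into pairwise averages (the coarse shape) and pairwise differences (the detail), using nothing but addition, subtraction, and halving, and can reconstruct the signal exactly.

```python
def haar_step(signal):
    """One level of the Haar wavelet transform. Takes an even-length
    signal, returns (averages, differences): pairwise means carry the
    coarse shape, pairwise half-differences carry the detail.
    Note there are no sines here, just adds, subtracts, and a halving."""
    avgs = [(signal[i] + signal[i + 1]) / 2 for i in range(0, len(signal), 2)]
    diffs = [(signal[i] - signal[i + 1]) / 2 for i in range(0, len(signal), 2)]
    return avgs, diffs

def haar_inverse(avgs, diffs):
    """Perfect reconstruction: each pair comes back as (avg + diff, avg - diff)."""
    out = []
    for a, d in zip(avgs, diffs):
        out.extend([a + d, a - d])
    return out

signal = [9.0, 7.0, 3.0, 5.0]
avgs, diffs = haar_step(signal)
print(avgs)                        # [8.0, 4.0]
print(diffs)                       # [1.0, -1.0]
print(haar_inverse(avgs, diffs))   # [9.0, 7.0, 3.0, 5.0]
```

On smooth signals most of the detail coefficients come out near zero, which is why wavelet-based codecs like DjVu can discard most of them and still reconstruct a good image. Repeating the step on the averages gives the multi-level transform.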
>>20963
>And there's always the case that I have just fomented a huge case of the Dunning-Kruger effect and have no idea what I'm talking about.
Lol. There's a somewhat-disreputable-but-probably-reasonably-effective adage that goes 'Fake it till you make it'. Maybe you're on the right track here, but are as-yet-inexperienced in using the language to spell it out clearly for us plebeians? Regardless, you certainly have the correct sentiments about efficient processing being a (very) high priority for the proper design of good robowaifus. Drive on! :^)
I found something I think could be really fruitful: geometric algebra. I was reading a totally unrelated blog and ran across this comment.

"...Geometric Algebra. It's fantastic, it's true, it's representative of how the universe works, it's relatively (hah!) simple, it's predictive, and it's almost completely ignored in scientific education. The behavior of both complex numbers and quaternions emerges from GA. Quantum spinors emerge from GA. Maxwell's 4 equations become a single GA equation describing the relationship between electric charge and magnetism. And all this derives from a single, simple, unifying principle..."

What this appears to be, to my limited understanding, is a fairly easy way to do complex calculations on vectors and many other problems, including those in many dimensions. It's been a long time since I studied math, but I remember taking a class on complex numbers and how you could treat them as vectors, and in consequence multiplying, adding, and otherwise manipulating them became very easy. I think this is much the same: you put what you are computing into this vector format, and then it becomes fast, needing little computing power to manipulate. The power of this impressed me, as you can take Maxwell's electromagnetic quaternion math (don't ask) and reduce it to a more easily manipulated vector form for calculations.

Anyway, here's a book: Eduardo Bayro-Corrochano, "Geometric Computing: for Wavelet Transforms, Robot Vision, Learning, Control and Action". And notice it says "wavelets". I had an intuition that wavelets would be helpful to us. Maybe they are. https://en.wikipedia.org/wiki/Geometric_algebra

You can go here http://libgen.rs/ and type in "Geometric Algebra" with the nonfiction/sci button selected to find many more books on this. I tried to upload the book I mentioned and it stops at 71%. It's maybe too big. So go to the address above, enter the title I mentioned, and you should be able to find the book. It's where I got it from.
This address is a great way to find books and scientific articles.
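To give a feel for the "complex numbers are vectors" connection, here is a tiny sketch of the 2D case in Python (illustrative only; real GA libraries, like those the book covers, are far more general). The geometric product of two vectors splits into a scalar part (the dot product) and a bivector part (the wedge, an oriented area), and in 2D the scalar-plus-bivector elements behave exactly like complex numbers, which is why rotations fall out of the algebra for free:

```python
import math

def geometric_product(a, b):
    """Geometric product of 2D vectors a and b.
    Returns (scalar part, e12 bivector part):  a b = a.b + (a ^ b) e12."""
    dot = a[0] * b[0] + a[1] * b[1]      # symmetric part: dot product
    wedge = a[0] * b[1] - a[1] * b[0]    # antisymmetric part: oriented area
    return dot, wedge

def rotate(v, theta):
    """Rotate v by theta. In 2D GA the rotor R = cos(theta/2) - sin(theta/2) e12
    acts via the sandwich product R v R~; worked out numerically, that
    sandwich reduces to the familiar rotation below. The point is that the
    rotor comes from the algebra itself rather than being bolted on."""
    c, s = math.cos(theta), math.sin(theta)
    return (c * v[0] - s * v[1], s * v[0] + c * v[1])

# Orthogonal unit vectors: dot vanishes, leaving a pure unit bivector.
print(geometric_product((1, 0), (0, 1)))   # (0, 1)
# Parallel vectors: wedge vanishes, leaving a pure scalar.
print(geometric_product((2, 0), (3, 0)))   # (6, 0)
# A quarter-turn carries e1 onto e2 (up to floating point).
print(rotate((1.0, 0.0), math.pi / 2))
```

The same geometric product generalizes unchanged to 3D (where the even elements are the quaternions) and higher, which is the unification the quoted blog comment is pointing at.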
>>22452 This is pretty fascinating Grommet, I think you might be onto something. The point about Maxwell's equations is spot-on. They are in fact a kludge-job. (Please don't misunderstand me, James Clerk Maxwell was a brilliant, impeccable man. A true genius. He simply started from the premise of the reality of 'The Ether', which doesn't exist.) Everything they attempt to describe can be done much more simply and elegantly today. Therefore, since it's correct on that major point, this stuff is probably right about the other things as well. Thanks Anon! Cheers. :^)
>'The Ether', which doesn't exist
BLASPHEMY! BLASPHEMY! I must immediately remove myself to the cleansing force of the Inertia Field. :) Did you know that this experiment DID NOT find that the speed of light is the same going in the direction of earth's orbit as compared to perpendicular to it? It did not. https://en.wikipedia.org/wiki/Michelson%E2%80%93Morley_experiment The textbooks say it did, but it did not. I have read, in the university library, an original copy of the experiment from the men themselves. In the back it gives the differences. And many, many, many other experiments gave the same results. The most recent one gave a null result, BUT they did it in an underground mine. Cheaters. Maybe they know more about this than they let on. Rid yourself of this silly pseudoscience that there's no ether.
>>22613 From Newton and Math to Ether. What next? Flat Earth? https://youtu.be/JUjZwf9T-cs
>>22662 I don't want to get into too much detail. I can, and maybe I will in the off-topic thread (it would take some digging through a lot of sources I believe I still have), but you should not equate what I said with flat earth. There were HUNDREDS of experiments, with increasingly accurate equipment, repeating the Michelson-Morley experiment, and none of them gave a null result showing the speed of light equal parallel and perpendicular to the earth's movement in space. So I can't prove there's an ether, but I can prove that the test they SAY proves there's no ether, and their interpretation of the results they SAY they got, is incorrect. The textbook explanation of this is wrong.
>>22613 Haha, thanks Grommet. You are henceforth teh /robowaifu/ official court Natural Philosopher. The great Michael Faraday is the chief of your clan. :^)
