>>26341
>since there were folks coming to the fair specifically to see my stuff.
Excellent. That's a great sign Anon. I don't think most anons realize what a yuge sea-change is coming related to robowaifus. The demand is going to be outrageous. Every anon honestly struggling and striving right now to effectively solve every.little.thing. will eventually have an outstanding opportunity to be a first-mover (& shaker) in this new industry, which I predict will literally become larger than the global automotive & transportation industries before all is said and done. Just let that idea sink in for a minute! :DD
>Building Spud, I better understand and appreciate all the little things the human body does that we take for granted, like compensating the shifts in center of gravity from something simple as raising an arm. So many little things one learns when building stuff.
God's design of higher life forms is nothing short of truly mind-boggling, if you simply open your eyes and observe.
So, in the film industry there's a sub-branch of EFX called animatronics (roughly the same as the corresponding field in the Dark Rides industry). A fairly commonplace tool for that work is a so-called Monkey, which is simply a manually-adjusted motion-control device fashioned into a crude approximation of the body form being controlled (much like a maquette, in the case of a humanoid).
Not much by way of preprogramming numbers, code, &tc., is needed for the technician-artists driving the monkeys. They simply act out the performance they want, pose by pose, and the full-sized rig (whether practical or virtual) follows suit. Kind of like an indirect MoCap scenario, so to speak. All it takes is patience and good instincts; writing code &tc. isn't needed at all at this stage of the performance pipeline. Just be a naturally good actor with a great sense of balance & pose.
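Just to make that concrete, here's the sort of thing I have in mind for the capture side. This is a minimal sketch assuming a pot-per-joint armature feeding ADC readings into the host; every joint name, range, and the read_raw_joints() stub is a made-up placeholder, not a real rig:
```python
# Hypothetical sketch: sample joint potentiometers on a 'monkey' armature
# and record timestamped keyframes. Joint names, value ranges, and the
# read_raw_joints() stub are placeholder assumptions.
import json
import time

JOINTS = ["shoulder_pitch", "shoulder_roll", "elbow", "hip_pitch", "knee"]

def read_raw_joints():
    """Stand-in for reading 10-bit ADC values (0-1023) from the armature's pots."""
    # A real rig would poll a microcontroller over serial/I2C here.
    return {name: 512 for name in JOINTS}

def raw_to_degrees(raw, lo=-90.0, hi=90.0):
    """Map a 10-bit ADC reading onto an assumed joint range in degrees."""
    return lo + (raw / 1023.0) * (hi - lo)

def record_performance(duration_s=5.0, rate_hz=30.0):
    """Pose-to-pose capture: sample the armature at a fixed rate."""
    keyframes = []
    t0 = time.time()
    while (t := time.time() - t0) < duration_s:
        raw = read_raw_joints()
        pose = {name: raw_to_degrees(v) for name, v in raw.items()}
        keyframes.append({"t": round(t, 3), "pose": pose})
        time.sleep(1.0 / rate_hz)
    return keyframes

if __name__ == "__main__":
    clip = record_performance(duration_s=2.0)
    with open("monkey_take_001.json", "w") as f:
        json.dump(clip, f, indent=2)
    print(f"recorded {len(clip)} keyframes")
```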
I've been working towards devising an inexpensive, bipedal-capable form for /robowaifu/, et al, and I've already started gravitating towards a monkey-like control system to program in the base movements such as walking, running, jumping, climbing, backflips, &tc. Once the animation data's recorded, it can of course be further refined & tweaked afterwards.
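And the 'refined & tweaked afterwards' part can start out as simple as knocking the sensor jitter out of a recorded take. A toy pass over the same hypothetical JSON clip format as above, nothing fancier than a moving-average filter:
```python
# Toy post-processing pass over a recorded take: smooth each joint channel
# with a small moving-average window to remove sensor jitter. The clip file
# name matches the hypothetical recorder sketch above.
import json

def smooth_channel(values, window=5):
    """Centered moving average; the window size is an arbitrary choice."""
    half = window // 2
    out = []
    for i in range(len(values)):
        lo, hi = max(0, i - half), min(len(values), i + half + 1)
        out.append(sum(values[lo:hi]) / (hi - lo))
    return out

def smooth_clip(keyframes, window=5):
    """Rebuild the clip with each joint channel smoothed independently."""
    joints = list(keyframes[0]["pose"].keys())
    smoothed = {j: smooth_channel([k["pose"][j] for k in keyframes], window)
                for j in joints}
    return [{"t": k["t"], "pose": {j: smoothed[j][i] for j in joints}}
            for i, k in enumerate(keyframes)]

if __name__ == "__main__":
    with open("monkey_take_001.json") as f:
        clip = json.load(f)
    with open("monkey_take_001_smoothed.json", "w") as f:
        json.dump(smooth_clip(clip), f, indent=2)
```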
Eventually we'll have a collection of software algorithms for 'bio'-kinematics control models accurate enough for retargeting, which will let us draw from the ginormous amounts of professionally-mocap'd performance data already out there and run it on our robowaifus' shells thereafter.
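Even the retargeting can begin life as something embarrassingly crude: a name-map from the mocap skeleton's joints onto whatever servos the shell actually has, plus clamping to each servo's limits. A hand-wavy sketch, with every joint name and limit invented purely for illustration:
```python
# Crude retargeting sketch: map joint rotations from a mocap skeleton onto a
# robowaifu's servo channels, clamping to each servo's limits. All names and
# limits here are invented placeholders.

# source-skeleton joint -> (target servo, min_deg, max_deg)
RETARGET_MAP = {
    "LeftArm_rx":     ("l_shoulder_pitch", -120.0, 120.0),
    "LeftForeArm_rx": ("l_elbow",             0.0, 135.0),
    "RightUpLeg_rx":  ("r_hip_pitch",        -45.0,  90.0),
    "RightLeg_rx":    ("r_knee",               0.0, 120.0),
}

def clamp(x, lo, hi):
    return max(lo, min(hi, x))

def retarget_frame(mocap_frame):
    """mocap_frame: {source_joint: degrees} -> {servo: degrees within limits}."""
    out = {}
    for src, angle in mocap_frame.items():
        if src not in RETARGET_MAP:
            continue  # drop channels the shell has no actuator for
        servo, lo, hi = RETARGET_MAP[src]
        out[servo] = clamp(angle, lo, hi)
    return out

if __name__ == "__main__":
    frame = {"LeftArm_rx": 150.0, "LeftForeArm_rx": 40.0, "Spine_rx": 5.0}
    print(retarget_frame(frame))
    # -> {'l_shoulder_pitch': 120.0, 'l_elbow': 40.0}
```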
>tl;dr
It's gonna be ebin in the end!! Cheers. :^)