/robowaifu/ - DIY Robot Wives

Advancing robotics to a point where anime catgrill meidos in tiny miniskirts are a reality.

I Fucked Up

“If you are going through hell, keep going.” -t. Winston Churchill


/robowaifu/meta-10: Building our path to the good end. Greentext anon 08/12/2024 (Mon) 05:24:31 No.32767
/meta, offtopic, & QTDDTOT

<--- Mini-FAQ
A few hand-picked posts on various /robowaifu/-related topics:
--->Lurk less: Tasks to Tackle ( >>20037 )
--->Why is keeping mass (weight) low so important? ( >>4313 )
--->How to get started with AI/ML for beginners ( >>18306 )
--->"The Big 4" things we need to solve here ( >>15182 )
--->HOW TO SOLVE IT ( >>4143 )
--->Why we exist on an imageboard, and not some other forum platform ( >>15638, >>31158 )
--->This is madness! You can't possibly succeed, so why even bother? ( >>20208, >>23969 )
--->All AI programming is done in Python. So why are you using C & C++ here? ( >>21057, >>21091, >>27167, >>29994 )
--->How to learn to program in C++ for robowaifus? ( >>18749, >>19777 )
--->How to bulk-download AI models from huggingface.co ? ( >>25962, >>25986 )
--->Why do you talk about feminism here? How are robowaifus related? ( >>27124, >>1061 )
--->Why should/shouldn't I do this; what's in it for me? ( >>33755 )
<---

-Library thread (good for locating terms/topics) ( >>7143 )

>note: There's a simple searching tool coded right here for /robowaifu/ that provides crosslinks straight to posts on the board. It's named Waifusearch, and the link to the latest code should always be maintained within the Library thread's OP & also on the current /meta.
---> Latest version of Waifusearch v0.2a ( >>8678 )
---> Latest version of /robowaifu/ JSON archives v221213 Dec 2022 https://files.catbox.moe/6rhjl8.7z
If you use Waifusearch, just extract these JSON files into your 'all_jsons' directory for the program, then quit (q) and restart.

>note: There's an archiving tool coded right here for /robowaifu/ that provides the ability to back up locally the posts & files from our board. It's named BUMP, and is basically a custom IB scraper.
Latest version of BUMP v0.2g ( >>14866 )

>note: There's a design document for the specification of general design and engineering choices for our basic Robowaifu Reference Model A Series (TBD). Please have a look at it, and collaborate together with us on it ITT, Anon.
-Robowaifu Design Document ( >>3001 )

>useful external resources
wAIfu-collective's AI guide - https://rentry.org/waifu-diy-ai

Previous meta threads: ( >>38 ) ( >>3108 ) ( >>8492 ) ( >>12974 ) ( >>15434 ) ( >>18173 ) ( >>20356 ) ( >>23415 ) ( >>26137 )

>=== -minor fmt patch -FAQ edit
Edited last time by Chobitsu on 12/20/2024 (Fri) 20:07:14.
>>>/meta/3 The >tl;dr is, AFAICT:
1. There was a billing mixup for Robi with the site backend provider.
2. Being full-on-kikes, they immediately baleeted everything as soon as the bill came overdue. > Fair warning: Don't ever trust the Globohomo with your robowaifu's data!!
3. Robi only had backups of the text messages, not the files (too big a collection). He did still have files from back in September or so, however.
4. That means any more-recent files are gone. On the (potential) upside, if anyone here will (re)post those files on, say, /meta/ , that will make them reappear again everywhere here on alogs.space (including /robowaifu/ )...
5. Which brings me to my screwup... I've been distracted by Uni & finals. I had a machine issue with the linux box that was doing BUMP backups of /robowaifu/ . Due to pressures in my current environment, like a fool I never made the time to fix the issue. My last backup was probably 2 months ago -- I'm not sure yet.
6. Which further leads me to my hope that any one of you Anons actually took my advice to have more than one of us making regular BUMP archives of the board. IF NOT... then I'm afraid we've lost at least a couple months' files -- presuming they will not be found anywhere else.
>tl;dr Most disasters involving complex systems are a comedy of smol errors that cumulatively lead to a big failure (cf. >>4631 ). Again, fair warning.
<--->
Regardless, I won't even really be able to look into my latest archive of the board for a few more days until Winter break. In the meantime I'm hoping one of you will come through for everyone as the hero we all need! Regards, Anons.
Edited last time by Chobitsu on 12/18/2024 (Wed) 04:24:44.
Open file (142.03 KB 1600x1200 D4VxGNEUIAA1d1L.jpg)
>>34962 No backups, unfortunately. What I do have is twenty-eight threads loaded and cached in my browser. If it's needed, then I can rip every thumbnail and upload them in a tar as a last resort.
>>34965 Hmm, that's an interesting idea, Greentext anon. Please review this with Robi at >>>/meta/3 . I'm unsure how that process would even work. But yes, IMO having even the thumbnails would be of benefit, Anon. Please keep that browser open! :^)
>>34965 Also, can you run a little experiment for me & re-upload the 3 pics that were the OP ITT, Greentext anon? I'm pretty sure they will re-integrate with Lynxchan's system properly again thereafter (even if you delete the dump post).
>>34965 >>34966 *Addendum: I have some of those images loaded. I just checked through my tabs, and a bunch of them have unloaded, though I'm not sure why. There's no apparent rhyme or reason to it. Like on this thread, everything before December is gone, except one of the header images (not that those are an issue, I have full quality copies of those). Or the Galatea thread, where I have everything except a handful of thumbnails in the middle of the thread. My browser might not have kept everything loaded, to save space, after I restarted a few days ago. For the time being, I'll hold off on running the next round of updates until this is resolved. On a related note, do you know how to pull thumbnails? The browser doesn't seem to acknowledge that they exist, and any attempt at saving just results in a broken file.
>>34968 Of course. I still have full quality images of everything I uploaded here. As for anyone else's images, it's a crapshoot, but I can do some searches on my drive.
>>34969 Browsers """save energy""" by unloading resources from tabs today. This could be what's happening. I've learned that I can save a page (if it's loaded properly into memory) using Brave even if the original has been deleted. Maybe this will work to save out a copy on some of them? >>34970 Hmm... I'm not seeing them 're-integrate' properly, Anon. Robi mentioned writing some kind of tool to help. Maybe things have changed on the alogs/Lynxchan backend that keep this from 'automagically' working now? Hmm. I'd say just organize missing files and wait for Robi's signal as per >>>/meta/3 . Thanks Anon! You're awesome. :^)
>>34971 Downloading the page doesn't seem to work, it's like the thumbnails aren't even there. There must be something fucky going on behind the scenes in Waterfox's code. At any rate, this lit the fire under my ass to finally get around to setting back up local backups (my old "solutions" were pretty ad-hoc, and I'm in the middle of moving hardware around). I'll make it my weekend project, and earmark some of the hard drives in my Horde™ for backing up this board. How frequently said backups occur depends on how much storage this board needs. I'll keep you up to date on that once I start setting things up again. For now though, I'm going to bed.
>>34972 >this lit the fire under my ass to finally get around to setting back up local backups Great! wget or cURL are OK-ish solutions if you have a fetish for a giant filepile in the middle of your drives, lol! :D OTOH, BUMP is quite efficient and organized in comparison (and Bumpmaster will be better-still [cf. Pageone : >>17561, >>33733 ]). But its chief benefit is in the simple 'fire & forget' cron job usage scenario. I used to run it 4 (or even 6) times a day for ~60-80 different IBs around teh Interwebs, on-and-off. It typically takes all of seconds to run against any given board (after the initial big-download) -- even across Tor. >tl;dr NoidoDev was pestering me to make it easier to get it up and running for him. I already did a tutorial on getting a box ready years ago (cf. >>5270, ...). Maybe I can use this event as a similar opportunity of a metaphorical 'sharp, pointy-stick to the butt' :D to get me to help him out. Any ideas how I might make setting up BUMP easier for newcomers, Greentext anon? TIA. >=== -minor edit
Edited last time by Chobitsu on 12/18/2024 (Wed) 06:51:34.
>>34972 >How frequently said backups occur depends on how much storage this board needs. IIRC, /robowaifu/ was at about 10GB in size at the time of my most recent archive a couple months ago. That was an observation in passing so YMMV by +/- 100% lol. :^) But again, BUMP will generally only take a few seconds each run (even at 10GB board size), and will only add anything that's actually changed since the previous run.
>>34974 > /IIRC,/ was at about 10GB in size It's absurd that a file that will fit on a cheap USB drive is not backed up. Even if it were a terabyte, these drives now cost what, $30 for a used enterprise SATA drive? These used enterprise drives are a real bargain, and they're all I use, with the provision that I mostly back everything up. The real solution to this is FREE hosting on your own computer with I2P, the anonymous Internet. At the least you could mirror the site in I2P on whatever computer you have. There are two versions: a Java version, which is likely safe for this site, and another version written in C++ that is likely more security-safe in the long run, but not as user-friendly (I2Pd). I2P https://geti2p.net/en/ I2Pd https://i2pd.website/ Though I do not know how to save large documents, I2P has a built-in torrent system based on "magnet" files that only works in I2P. It does not go out to the regular net, so it's anonymous, and you can download any size of whatever you want. So for files that are big, videos of progress, etc., one could just put a copy in their download folder and share the magnet link, and as long as someone is still serving it, it will never go away. >BUMP The easiest way to get people to use this (I'm assuming it's a command-line program) is to put a simple GUI on it. I know programmers hate GUIs, but I suspect it's because they are such a pain in the ass with hordes of typing (and it lets them thumb their noses at the masses). So do something different: use Rebol for the GUI with the C codebase below it. The easiest way I know to do this without a vast amount of BS, study and heartache is to use the Rebol programming language. It can work with C and has a built-in DSL for GUIs. It's only a couple of megabytes and works with Windows, Apple OSX, Linux and the BSDs.
Rebol overview http://www.rebol.com/ A few examples https://www.rebol.com/pre-view.html Download https://www.rebol.com/download-view.html Super quick start overview http://re-bol.com/rebol_quick_start.html This is a great page where the guy has written simple, direct examples that show what it can do in a few minutes. He also has a whole program he wrote for a secondhand retail shop. "... My most recent little tutorial is a quick introduction to practical Rebol, intended for non-programmers, but fast paced enough for developers coming from other tools (20+ app examples in about 1-2 hours): http:// re-bol.com/starting_computer_programming_with_rebol.html The most in-depth of my R2 tutorials is 850 pages (it includes some materials and explanation not found in the more visible http://re-bol.com tutorial): http://business-programming.com ..." The above was quoted from http://www.rebolforum.com/index.cgi?f=printtopic&topicnumber=47&archiveflag=new which has more resources. More examples http://www.rebol.net/cookbook/ This last one looks interesting too: a library with sharp-looking graphics, lightweight, and able to run on microcontrollers. Light and Versatile Graphics Library (lvgl) https://lvgl.io/ https://lvgl.io/features
where is this "BUMP"?
I tried to compile BUMP a few times recently and wasn't able to. I'm currently using NixOS on my old laptop and ran into issues compiling BUMP on it. My PC had other issues, and I was too frustrated about that and didn't use it for months; I'm not even sure if I had BUMP working on that one. I have one version on my Raspi, but the disk I stored the archive files on was only plugged into it from time to time. So it was a few months ago that I used it to archive the board. I don't know how many months, but I can look into it. That said, it's certainly more than just two. I don't recall anyone else on this board ever asking about BUMP or stating that they compiled it. Most people here generally still just lurk, and I had some issues, so I also stopped doing much. So, let's hope there's a happy surprise. I can of course upload my own files again, but that requires knowing which ones these were. Thankfully I started naming them more and more. >>34976 >Gui It's not about a GUI, it's something that can run in the background.
>>34979 >it's something that can run in the background Does it scrape the site? I could see something constantly scraping the site causing bandwidth problems.
>>34976 Wasn't there some drama a couple years back when the new owner of the project started doing fishy shit? >>34973 >If you have a fetish for a giant filepile in the middle of your drives You're talking to a man that has a box full of hard drives loaded with old and likely useless LLMs. So, I'm not worried about it. I'll try compiling bumpmaster first, though. >>34974 >10GB Wait, that's it? I have a bag full of MicroSDs larger than that. Once I get everything set up, I'll do daily backups if I can get bumpmaster working. If I can't, then it'll be weekly.
>>34962 This does open an opportunity for us to start considering personal backups. You've already started working on software tools that are relevant to this. >>34965 Please continue to be based. On a personal note, write more little stories. Your words mean more than you know to our goals. My personal recommendation is that we start to have redundancies: websites which host their own /robowaifu/ so that posting can continue when one is down, followed by importing posts between the boards as needed. I do worry this could fracture our tiny, fragile community.
>>34976 Thanks Grommet, good advice all. >Rebol Neat! Didn't know about that one. My intent is to use FLTK, which has an incredibly long history going all the way back to the NeXT & Sun days. It runs on basically every hardware platform -- even MCUs that can drive a display (e.g., a robowaifu's totally-not-boobas chest display system. :^) >>34977 >where is this "BUMP"? It's always been linked in the OP of our /meta's since its inception in late 2019. >--->note: There's an archiving tool coded right here for /robowaifu/ that provides the ability to back up locally the posts & files from our board. It's named BUMP, and is basically a custom IB scraper. >Latest version of BUMP v0.2g ( >>14866 ) >>34979 >So, let's hope there's a happy surprise. This. As to getting it to compile, I'll work to make sure that Bumpmaster has fewer dependencies. This should ease your task. >It's not about a GUI, it's something that can run in the background. Also this. Doing a cron job on a terminal program doesn't get much more reliable/simple as a systems-maintenance tool. >>34981 >Does it scrape the site? I could see something constantly scraping the site causing bandwidth problems. It does. But because thought went into it, it doesn't consume any more bandwidth than simply downloading the catalog JSON and examining it. Anything further is only downloaded if there has actually been a change. It takes less than 1MB of bandwidth, and about 5 secs of time, in the nominal case. >>34983 >I'll try compiling bumpmaster first, though. Actually it's not ready yet. I solved almost every issue needed, however, while writing the simple terminal utility Pageone. I'd suggest you have a look into that if you want to see how Bumpmaster is going to be even faster/more efficient than BUMP was/is. >>34986 >This does open an opportunity for us to start considering personal backups. Indeed! Everyone here please do. <---> Cheers, Anons. :^)
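For anyone curious how that kind of delta-check works in principle, here's a minimal sketch of comparing two catalog.json snapshots to decide which threads actually need re-downloading. This is an illustration, not BUMP's actual code, and the `threadId` / `lastBump` field names are assumptions about the catalog schema -- verify against a live catalog.json before relying on them.

```python
import json

def changed_threads(old_catalog: str, new_catalog: str) -> list:
    """Compare two catalog.json snapshots; return IDs of threads that
    are new or have been bumped since the previous run."""
    old = {t["threadId"]: t.get("lastBump") for t in json.loads(old_catalog)}
    return [t["threadId"] for t in json.loads(new_catalog)
            if old.get(t["threadId"]) != t.get("lastBump")]

# Two fake snapshots: thread 2 was bumped, thread 3 is new.
before = json.dumps([{"threadId": 1, "lastBump": "a"},
                     {"threadId": 2, "lastBump": "b"}])
after = json.dumps([{"threadId": 1, "lastBump": "a"},
                    {"threadId": 2, "lastBump": "c"},
                    {"threadId": 3, "lastBump": "d"}])
print(changed_threads(before, after))  # [2, 3]
```

If nothing changed, the list comes back empty and the scraper downloads nothing beyond the catalog itself -- which is why the nominal run stays under 1MB.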
Open file (2.31 MB 2736x1357 screenshot1.png)
Open file (1.95 MB 2736x1395 screenshot2.png)
Open file (2.29 MB 2736x1367 screenshot3.png)
Open file (2.09 MB 2736x1396 screenshot4.png)
Good news! My browser saved the thumbnails. If anything, we can at least look into reuploading the thumbnails for organizational and morale reasons.
If anyone wants any larger screenshots, just ask.
>>34991 I'm not sure how to do this tbh; I realized as much earlier when talking about this with Greentext anon and GreerTech. AFAICT, Lynxchan (the server software that /robowaifu/ sits on top of) auto-generates the thumbnails (storing them in a system folder), and then places them when it builds the webpages for the board on-demand. If anyone can figure out how to go about making this work, then by all means give it a shot. If it doesn't work, then hopefully Robi's fix plus our reupload dump will do the trick. Thanks Anon, good thinking! :^)
Can someone explain why some very old pictures on the board still don't work? Is it difficult to restore the whole thing? I can't upload missing newer files, if I don't know what's missing. At some point we'll need some kind of list. Unfortunately, just today my btrfs filesystem on my external disk failed with "parent transid verify failed" https://archive.kernel.org/oldwiki/btrfs.wiki.kernel.org/index.php/FAQ.html#How_do_I_recover_from_a_.22parent_transid_verify_failed.22_error.3F - which is the most feared error. Apparently this mostly happens when WriteCache is enabled and some hardware is wrong. It's outrageous to me that this is possible in 2024. Anyways, I will have to try to use btrfs-restore tomorrow after buying another big disk. For now, I don't know what files I will get back from that disk. Btw, just in case, don't try to give me "just googled" tips on btrfs. Thanks.
>>34994 >Can someone explain why some very old pictures on the board still don't work? I'm pretty sure this is related to why Robi needs to write an addon to address the entire set of issues, NoidoDev. Anything further, I'm not sure. It used to be that if you simply reposted the exact file again, all the regeneration worked automatically. Clearly that's not the case any longer. >disk failed Oh no! Maybe Greentext anon & frens can help out? I don't know myself. I sure hope you find a fix, Anon!
>>34996 So unless you have the original file with the same filehash, someone needs to either manually update the src links in the HTML page to point to a new file, or manually add a new file to .media named with the filehash of the missing one.
>>34996 Isn't it a database? Why html manipulation? Also, hashing the files is easy, if you have them. If only recent media files are lost, then he should have all others. Assuming my data isn't gone for some unrelated reasons, I should have all the files I uploaded, I just need the names. Anyways, no rush, I was just wondering.
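Since the media files are named by their hash, matching local copies against the missing names is mostly mechanical. A minimal sketch, assuming the names are SHA-256 hex digests (the actual hash algorithm LynxChan uses is an assumption here -- check a known-good .media filename against a known file first):

```python
import hashlib
from pathlib import Path

def sha256_hex(path: Path) -> str:
    """Hash a file in chunks so large uploads don't need to fit in RAM."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 16), b""):
            h.update(chunk)
    return h.hexdigest()

def match_missing(files, missing_hashes):
    """Map each missing media hash to a local file that produces it."""
    index = {sha256_hex(Path(p)): Path(p) for p in files}
    return {h: index[h] for h in missing_hashes if h in index}
```

Point it at a folder of your saved uploads plus the list of missing names, and it tells you exactly which files you hold copies of.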
>>34995 Just in case someone uses btrfs file system on an external drive: https://wiki.tnonline.net/w/Btrfs/Parent_Transid_Verify_Failed - consider turning WriteCaching off in advance, out of caution. > hdparm -W 0 /dev/sdX
>>34997 Probably; it would be the same thing. The img links on the HTML page are the filehash, so those must be the entries. Can you re-upload a missing file? It might be treated as a duplicate and won't be stored if the entries are still in the db.
>>35000 Absolute trashfire. >>35001 I uploaded it, and it actually has the same hash, but if I reload the original post, it still doesn't load.
Open file (2.64 MB 720x1280 1683064327201873.webm)
Brief aside here, what's the name of this little robot?
>>35006 I found it with the power of reverse image search: Kibo-chan.
Open file (168.28 KB 307x466 1434130528889.png)
>>34994 >>34999 I never use btrfs, and failures like this are why. It's been in beta for over a decade, remains unstable, and is, in my opinion, unsuitable for storing anything important. I don't know how to fix your error (nor do the developers, I'd wager), but I'd recommend switching to ext4 once you get your data back. I've never had a partition fail before the hardware did. If you really need the extra redundancy, then keep multiple separate copies and/or install mdadm and make a RAID1 software array. >>34986 Thank you, Kiwi. I've been pretty burnt out and distracted with other stuff, but I won't ever give up on writing. There will be more, and I'll definitely have something up by Christmas. >>34989 >Actually it's not ready yet No rush, my setup isn't ready either. I won't be doing any work on that until the weekend, and I have other tasks to run on it once everything's set up again.
>>35009 >No rush, my setup isn't ready either. >I won't be doing any work on that until the weekend, and I have other tasks to run on it once everything's set up again. Just in case it's not clear: I make a distinction between BUMP, which is ready (such as it has been for years now lol), and Bumpmaster, which isn't. The latter should be a nice improvement over the former, but I'm stalled with its development for the time being. >tl;dr You can install BUMP today. Bumpmaster is TBD. Hope that makes sense, Anon. Cheers. :^)
>>35009 >You can install BUMP today. Bumpmaster is TBD. Nearly everyone here should have installed it years ago and run it in the background -- and if it didn't work, complained in the thread for it. 10GB is a one-time cost; after that it only adds the new files, so it's not consuming a lot of resources. Thirdworlders with mobile connections are of course exempt from this argument. That said, since it would also download every "illegal" spam, it might be better to put it into an encrypted folder, just in case the device is lost at some point and someone wants to misinterpret it. I actually wondered if it deletes the old data when something is deleted from the board. That would be suboptimal on one hand, for backup integrity, but would keep these backups free of the "spam".
>>35006 >>35007 From what I can tell, Kibo-chan is also animated using Blender, thanks to a plugin that allows bone rotations to be transformed into servo positions (some older prototypes of Kibo feature little faces singing along with a Blender view window). I personally modified the plugin to export animations to raw text files for SPUD. Not sure if I'll ever release that tho, cuz it's a mess.
>>34757 >>34758 Apologies for the delay, I've been busy these last weeks. As far as decorations go I can't think of anything else. You and CSS anon have made a good job at decorating it. As for the streaming part, if I'm reading this correctly you want to setup a Cytube channel on the site. Is that correct? In any case I'm not sure if I'll be able to make it. When are you guys planning to do that stream?
>>35013 Heh, I'd love to have one for us to represent /robowaifu/ at the /christmas/ party (and maybe introduce some Anons who still don't know yet), but I'm not the guy. I just wanted to throw the idea out there in case someone else wanted to take up the mantle. <---> Regardless, thanks again Anon! I see you're taking good care of us on /christmas/ these days as well. Cheers. :^)
>>35010 >I actually wondered if it deletes the old data if something is deleted from the board. No it doesn't. BUMP is quite simple in such regards, as it literally just writes thread-organized directories of files (including HTMLs & JSONs of the thread pages themselves) for any given board (such as /robowaifu/ ). This is by design for robustness & availability : filesystems are the one 'practically-guaranteed-to-be-there' universal datastore. Which means you could run BUMP on an MCU. This also lends itself directly to any scripting-based approaches for archive management after the fact (which served everyone but the GH & their pets very well when Trashmin used that fact to move boards over with BUMP, when anon.cafe was kill). For Bumpmaster my plan is to offer a few more-sophisticated management approaches (like thread synchronizations), all while still maintaining that plain & simple directory-based system underneath. I hope that answers your questions, NoidoDev. * --- * btw, I have had to go back in and rm directories that BUMP auto-dl'd (via the aforementioned cron jobs) when I later realized that troon glowniggers had posted cp-spam threads against us all. easy peasy fix though; from the board archive's local directory just issue : rm -rf <muh_foo_thread_dirname> simple as. >=== -prose edit
Edited last time by Chobitsu on 12/20/2024 (Fri) 00:13:35.
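Because the archive is just thread-named directories on disk, the after-the-fact cleanup described above is plain filesystem scripting. Here's a hedged sketch of that `rm -rf <muh_foo_thread_dirname>` step with a guard against typos escaping the archive root; the directory layout is assumed, so adjust the paths to your own BUMP output.

```python
import shutil
from pathlib import Path

def prune_thread(archive_root: Path, thread_dir: str) -> bool:
    """Delete one thread's directory from a local board archive.
    Refuses anything outside archive_root, so a stray name can't
    escape the archive (a careless rm -rf equivalent can)."""
    target = (archive_root / thread_dir).resolve()
    if archive_root.resolve() not in target.parents:
        return False          # path traversal, or the root itself
    if not target.is_dir():
        return False          # nothing to prune
    shutil.rmtree(target)
    return True
```

Run from a cron-adjacent script, this keeps the "plain directories" property intact: everything stays inspectable with ls, du, and friends.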
>>35008 I hit a particular error where the disk makes some errors and messes up the filesystem beyond repair. >I'd recommend switching to ext4 I'm switching to zfs, without performance options, focusing on integrity. It seems to work well on Linux now. Forward ever, backwards never. I consulted with Claude before deciding. >make a RAID1 software array If I had had two disks, I could just have used the other for regular backups, which works well with btrfs. I'm nearly sure that would not have copied the error over. I also should have had error reporting for the scrub which I do have running in the background. Thanks anyway. I now got my (now overpriced) 4TB disk, and will start trying to save the data and finally sort out all my backups on different drives. Thankfully my (newer and old) OpenSCAD robowaifu files were also on other disks.
Open file (783.87 KB 1920x1080 happy_Yumemi.png)
>>35016 >Bump Generally it's already good that we have it. Thanks. It's just unfortunate that I didn't have it on my current main computer, and didn't have the right disk attached to the Raspi which is always on (because my external USB hub was broken). I wish there was a way to trigger the right amount of fear of data loss before it happens. We always need systems in place to prevent it. I even already bought an external BluRay burner and M-Discs for the most important backups, but didn't use them. >>34957 >Christmas online party Yeah, in regards to Christmas: I did look into it and might again, at least around the actual date. I'm not hugely into watching streams, though. I normally have my files on my disk and watch things locally. I should make a list of the apparently recommended Christmas and Winter shows and movies, since it's always good to have some of them. I can also recommend the Imaginaerum movie (from the band Nightwish) and Eight Below. I also tend to watch "The Thing" around wintertime. I don't know much good winter anime so far. I'm not that fond of "A Place Further than the Universe", though it's still kinda good, and I never could watch "Tokyo Godfathers" since I was repulsed by it. The Re:Zero specials are fine, especially Frozen Bond, but not really great. Planetarian also has a winter setting, and it even has a somewhat religious vibe to it.
>>34989 >But because thought went into it, it doesn't consume any more bandwidth than simply downloading the Catalog JSON, and examining. Anything that's further downloaded only happens if there has actually been a change. I see, said the blind man. I didn't know that there was such a catalog/database/whatever that listed all the posts. Very nice!
>>35020 Yep. This is typically known as an IB's 'API' (though why that is I don't know, and it's probably silly anyway :D). https://alogs.space/robowaifu/catalog.json
>>34962 Hey Chobitsu, please contact me on IRC when you get the chance. Thanks.
>>35022 Sure thing, Robi! I'll have to wait until tomorrow sometime, however. Do you have a preferred UTC time for such a contact? In other words, what time would work best for you, say, 12+ hrs from now?
>>34971 >Maybe things have changed on the alogs/Lynxchan backend that keep this from 'automagically' working now? The problem here is that LynxChan only creates a file on the disk if there is no upload reference for it. It reference counts all uploaded files and just changes the reference count if a new post with the same file as before has been uploaded. This saves on disk I/O if the file system is consistent with the database, but obviously it's not consistent now and so it will simply discard the file you uploaded even though it's not actually present. I'm working on a LynxChan addon that modifies this process so that even if an upload reference is present, if there's no corresponding entry in the "fs.files" collection in MongoDB, it will force-create the file on the disk (and an entry on the database).
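The repair logic in that addon can be illustrated with a toy model. Python stands in for the actual Node/MongoDB addon here, and the dict-as-collection shape is purely illustrative -- LynxChan's real schema and hook points are not shown by this sketch.

```python
def ensure_media(upload_hash, data, fs_files, disk):
    """On upload: normally a duplicate hash just bumps the reference
    count. But if the DB and filesystem are out of sync (a reference
    exists while the backing file/entry is gone), force-create both
    instead of silently discarding the upload as a duplicate."""
    if upload_hash in fs_files and upload_hash in disk:
        fs_files[upload_hash]["refs"] += 1      # normal dedup path
        return "deduplicated"
    disk[upload_hash] = data                     # force-create the file
    fs_files[upload_hash] = {
        "refs": fs_files.get(upload_hash, {}).get("refs", 0) + 1
    }
    return "restored"
```

This is exactly the failure mode seen above: the reference survived the wipe, so re-uploads were treated as duplicates of a file that no longer exists, and the addon's job is to take the "restored" branch instead.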
>>35023 Currently I'm best reachable in 04:00-07:00 UTC (morning for me), and 15:00-21:00 UTC (afternoon/evening). Just hit me up within those times.
>>35024 >I'm working on a LynxChan addon that modifies this process so that even if an upload reference is present, if there's no corresponding entry in the "fs.files" collection in MongoDB, it will force-create the file on the disk (and an entry on the database). Got it! This sounds like a workable approach. We have several technically-knowledgeable Anons here, and I'm sure they appreciated knowing such things. I know I do. >>35025 OK, I'll plan to drop in and give you an 'AUUUUUGHHHH' today sometime during the next few hours. Probably between 17h - 19h UTC, I'd guesstimate. Cheers.
>>34962 Hope your degree is going well fren. What are you studying? I'm currently in school for Data Science so I'll eventually be useful for AI dev. Cheers
>>35025 Robitt, what's the channel name again? #my jewish motherworld doesn't seem to be working for me r/n. >>35030 Nice! Glad to hear it Kiwi. I expect you'll master R, eventually ehh? Just remember at the end of the day, we need to fit 98% of all of that yuge data down onto onboard systems for our robowaifus otherwise, she won't be able to climb the nearest mountain peaks for lunches together!!! :^). >tl;dr > Fair warning: Don't ever trust the Globohomo with your robowaifu's data!! >ttl;dr < Pls don't let your professors or peers convince you that the cloud is a good thing! >Hope your degree is going well fren. What are you studying? Thanks! You too, fren. Trying to catch up on maths stuff, so I can move on from the 9th grade haha. Also, I want to be an a*rsehole robowaifu-industry billionaire and hire everyone here, so I'm focusing some attention on business. TWAGMI
>>35021 You would only assume this might be silly if you've never tried to parse HTML with some script. The HTML can change, btw. APIs like this should rather be mandatory by law.
>>35040 The main problem with a pure REST API is that it does not allow for actions to be presented. What you get is the data, and the actions you can perform on it are out-of-band (some other request you can send, described in the API documentation). Take this thread for example; in the JSON you only get the individual posts, which is fine if you only need the data, but it does not show a posting box, which posting fields are required, etc. I think a much better approach could be if the HTML output were semantically useful, with a stable structure for each item on the page; for example, if each post contained the same data that the JSON response contains, plus the actions you can perform on it, then you would be able to discover and perform actions based on what the post contained. Stuff like htmx is exploring this approach, and it's quite interesting. One downside, though, is as you mentioned: parsing HTML correctly is quite a bit harder than JSON, but any respectable language is going to have a compliant HTML parser anyway (BeautifulSoup or lxml in Python, DOMParser in JS, ...).
>>35040 Yeah, I was only referring to backups and data scraping. Anything beyond that needs to be separate.
