/robowaifu/ - DIY Robot Wives

Advancing robotics to a point where anime catgrill meidos in tiny miniskirts are a reality.


Open file (14.96 KB 280x280 wfu.jpg)
Beginner's guide to AI, ML, DL. Beginner Anon 11/10/2020 (Tue) 07:12:47 No.6560
I already know we have a thread dedicated to books, videos, tutorials etc. But there are a lot of resources there, and as a beginner it is pretty confusing to find the correct route to learn ML/DL well enough to be able to contribute to the robowaifu project. That is why I thought we would need a thread like this. Assuming that I only have basic Python programming, dedication, and love for robowaifus (but no maths, no statistics, no physics, no college education), how can I get advanced enough to create AI waifus? I need a complete pathway directing me to my aim. I've seen that some of you recommended books about reinforcement learning and some general books, but can I really learn enough by just reading them? AI is a huge field, so it's pretty easy to get lost.

What I did so far was to buy a great non-English book about AI: philosophical discussions of it, general algorithms, problem-solving techniques, its history and limitations, game theory... But it's not a technical book. Because of that I also bought a few courses on a website called Udemy, about either machine learning or deep learning. I am hoping to learn the basic algorithms through those, but because I don't have the maths it is sometimes hard to understand the concepts. For example, even when learning linear regression, it is easy to use a Python library, but I can't understand how it actually works because of the calculus I lack. That issue makes it hard for me to understand the algorithms in general.

>>5818 >>6550 Can those anons please help me? Which resources should I use in order to be able to produce robowaifus? If possible, you could even create a list of books/courses for me to follow one by one to achieve that aim of mine. If not, I can send you the resources I have and you can help me put them in order. As you can tell, I also need some guidance on the maths.
Yesterday, after deciding and promising myself that I will give whatever it takes to build robowaifus, I bought 3 courses about linear algebra, calculus, and stats, but I'm not really good at them. I am waiting for your answers, anons. Thanks a lot!
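On the linear-regression point: the calculus inside it boils down to two partial derivatives that tell you which way to nudge the slope and intercept to shrink the squared error. A minimal sketch in plain Python (toy data invented for illustration, no library needed):

```python
# Fit y = w*x + b by gradient descent on the mean squared error.
# The "calculus" is just the two partial derivatives dw and db below,
# both derived with the chain rule.

xs = [1.0, 2.0, 3.0, 4.0]
ys = [3.0, 5.0, 7.0, 9.0]  # generated by y = 2x + 1

w, b = 0.0, 0.0
lr = 0.01  # learning rate: how big a nudge to take each step

for _ in range(5000):
    n = len(xs)
    # dMSE/dw and dMSE/db for the current w, b
    dw = sum(2 * (w * x + b - y) * x for x, y in zip(xs, ys)) / n
    db = sum(2 * (w * x + b - y) for x, y in zip(xs, ys)) / n
    # Step downhill on the error surface
    w -= lr * dw
    b -= lr * db

print(round(w, 2), round(b, 2))  # converges toward 2.0 and 1.0
```

That loop is, at heart, what the library call does for you; everything fancier in DL is the same idea with more parameters.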
Open file (1.04 MB 1952x1508 ClipboardImage.png)
Open file (314.05 KB 1024x747 ClipboardImage.png)
Open file (697.77 KB 3300x2550 ClipboardImage.png)
Open file (136.05 KB 1650x1275 ClipboardImage.png)
Open file (575.21 KB 1250x884 ClipboardImage.png)
Open file (287.38 KB 800x566 ClipboardImage.png)
Open file (463.83 KB 1250x883 ClipboardImage.png)
Open file (1.12 MB 1250x883 ClipboardImage.png)
Open file (559.02 KB 1753x1240 ClipboardImage.png)
Open file (482.82 KB 1250x884 ClipboardImage.png)
Open file (1.19 MB 1250x966 ClipboardImage.png)
Open file (1.18 MB 1250x966 ClipboardImage.png)
Open file (1.15 MB 1250x966 ClipboardImage.png)
Open file (1.13 MB 1250x966 ClipboardImage.png)
Open file (579.35 KB 1250x884 ClipboardImage.png)
Open file (491.45 KB 1250x884 ClipboardImage.png)
Open file (1.19 MB 1250x966 ClipboardImage.png)
Open file (1.46 MB 1250x966 ClipboardImage.png)
Open file (1.10 MB 1250x883 ClipboardImage.png)
Open file (285.42 KB 1000x704 ClipboardImage.png)
Open file (347.72 KB 1250x693 ClipboardImage.png)
Open file (327.08 KB 1250x975 ClipboardImage.png)
Open file (261.90 KB 850x1100 ClipboardImage.png)
Open file (165.69 KB 875x555 ClipboardImage.png)
Open file (606.29 KB 1652x1250 ClipboardImage.png)
Okay, thanks. That's the better way to do it.
>>8256 Y/w. There's still the pdfs to get if you're interested.
Here's a good example of a little lecture on how to understand an AI problem better: https://nitter.dark.fail/svpino/status/1357302018428256258#m - I saved it as a pdf; no idea if that's useful, but that Nitter instance might not be there forever.
>>8447 Neat, thanks Anon.
Open file (81.70 KB 1378x562 IMG_20210208_184145.jpg)
Open file (164.53 KB 594x1937 IMG_20210208_090120.jpg)
Open file (65.89 KB 1276x846 IMG_20210208_072837.jpg)
Open file (238.86 KB 1200x927 IMG_20210208_071849.jpg)
Posted on Twitter by data scientists.
>A curated list of awesome, free machine learning and artificial intelligence courses with video lectures. All courses are available as high-quality video lectures by some of the best AI researchers and teachers on this planet. >Besides the video lectures, I linked course websites with lecture notes, additional readings and assignments. https://github.com/luspr/awesome-ml-courses
>>8495 Thanks! That's a lot of content. It would be great if we could find somewhere safe to keep this stuff, off of YouTube.
Has anyone here read the book "Artificial Intelligence: A Modern Approach"? Any thoughts on it?
>>8609 Not yet, but noted, in case I haven't already. Then again, I've downloaded and noted reading material that would take me 10 years or more to work through. I'll start on anything beyond simple chatbots, image and voice recognition once I've mastered those basics and the stuff I'm already working on, and once I have a functional female body that needs to get smarter. Which will be the case in 5-10 years, I hope. Related: the uploaded file and https://github.com/aimacode
I'll probably go with that, as soon as I have the time: >Dive into Deep Learning: An interactive deep learning book with code, math, and discussions, based on the NumPy interface. >Implemented with NumPy/MXNet, PyTorch, and TensorFlow >Adopted at 175 universities from 40 countries >Each section is an executable Jupyter notebook >You can modify the code and tune hyperparameters to get instant feedback to accumulate practical experiences in deep learning. http://d2l.ai/
Open file (94.73 KB 1362x1104 IMG_20210301_213941.jpg)
Open file (87.07 KB 1267x710 IMG_20210222_233049.jpg)
Open file (137.51 KB 577x385 IMG_20210219_162908.jpg)
Open file (162.50 KB 1588x938 IMG_20210218_113525.jpg)
Open file (100.78 KB 700x1458 IMG_20210313_234930.jpg)
Open file (148.21 KB 1080x1080 IMG_20210313_185721.jpg)
Open file (105.91 KB 577x385 IMG_20210325_191728.jpg)
Open file (64.24 KB 784x427 IMG_20210325_171914.jpg)
Open file (178.76 KB 2334x1025 IMG_20210322_143128.jpg)
Open file (97.96 KB 1207x842 IMG_20210322_084800.jpg)
>>9266 > #1 That actually helps me understand that term a lot better now. I still don't understand the implications one way or the other, but at least I kind of have an idea what it means now. > #2 I know exactly what that is -- personally -- but I can also add that eventually you'll holistically put everything together in your head if you just keep moving forward. Don't quit and you'll succeed. > #3 Somehow, I can understand an idea when it's represented graphically like that, rather than as a bunch of 'indecipherable' math symbols in a formula. I don't think I'm alone in this either. > #4 LOL. Surely this is a simple thing to pick up over the weekend. R-right, Anon? :^)
>>9274 The last pic is more about having an overview and being aware that there's more than deep learning. In some cases another method might work better, or be good enough but easier to achieve. It's also useful to use these terms when searching for videos to get an idea about something; maybe one of them fits a problem you want to solve.
>>9278 Makes sense, thanks Anon. It's a big field.
Open file (220.57 KB 1199x540 IMG_20210331_191630.jpg)
Open file (52.91 KB 1404x794 IMG_20210331_191334.jpg)
> completely removing the background of a picture (robust PCA) > PCA's main goal: dimensionality reduction. >You can take a bunch of features that describe an object and, using PCA, come up with the list of those that matter the most. >You can then throw away the rest without losing the essence of your object. https://nitter.dark.fail/svpino/status/1377255703933501445
>>9375 That's pretty powerful. I imagine glowniggers are using this idea extensively for surveillance isolation. Not only would this work with a 'pre-prepared' empty background plate for extraction, but a separate system could conceivably create (and keep updated under varying lighting conditions, say) an 'empty' plate from a crowded scene, simply by continuously sweeping the scene and finding areas that don't change much frame-to-frame. These blank sections can then all be stitched together to create the base plate used during the main extraction process. Make sense? Ofc, a robowaifu can use this exact same technique for good instead: staying alert to important changes in a scene, alerting her master to anything she sees that might be an impending threat, or even taking action herself to intervene. Simplification is the key both to understanding and to efficiency, in visual processing and other areas.
Edited last time by Chobitsu on 05/10/2021 (Mon) 01:01:13.
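For anyone who wants to see the PCA idea from the posts above in action, here's a minimal sketch (NumPy, toy data invented for illustration): project 2-D points onto their top principal component and reconstruct them, discarding the low-variance direction.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: points scattered along a line plus small noise --
# one direction carries almost all of the variance.
t = rng.normal(size=(200, 1))
X = np.hstack([t, 3 * t]) + 0.05 * rng.normal(size=(200, 2))

# PCA via SVD of the centered data
mean = X.mean(axis=0)
Xc = X - mean
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)

# Keep only the top component (k = 1) and reconstruct
k = 1
X_hat = Xc @ Vt[:k].T @ Vt[:k] + mean

# The reconstruction error is tiny: the discarded direction
# held almost none of the variance.
err = np.abs(X - X_hat).max()
explained = S[0] ** 2 / (S ** 2).sum()
print(err, explained)
```

Same principle scales to images: a nearly static background lives in a low-rank subspace, and whatever doesn't fit that subspace is the moving foreground.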
Related: GPT-2 for beginners >>9371
Illustrated guide to transformers, a step by step introduction: https://youtu.be/4Bdc55j80l8
Edited last time by Chobitsu on 05/10/2021 (Mon) 00:59:53.
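The heart of the transformer covered in that video is scaled dot-product attention, which fits in a few lines. A bare-bones NumPy sketch (shapes and data invented for illustration):

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax: rows become weights summing to 1
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention(Q, K, V):
    """Scaled dot-product attention: each query produces a blend of
    the values, weighted by how well it matches each key."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)      # (n_q, n_k) similarity scores
    weights = softmax(scores, axis=-1)   # each row sums to 1
    return weights @ V                   # (n_q, d_v) blended values

rng = np.random.default_rng(0)
Q = rng.normal(size=(4, 8))   # 4 query tokens, dim 8
K = rng.normal(size=(6, 8))   # 6 key tokens, same dim as queries
V = rng.normal(size=(6, 16))  # one value per key, dim 16
out = attention(Q, K, V)
print(out.shape)  # (4, 16)
```

Multi-head attention is just this run several times in parallel with different learned projections of Q, K, and V.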
Open file (18.21 KB 1600x900 IMG_20210514_143007.jpg)
Open file (24.74 KB 1600x900 IMG_20210514_143019.jpg)
Open file (18.13 KB 1600x900 IMG_20210514_143028.jpg)
Open file (47.79 KB 2400x1350 IMG_20210514_165116.jpg)
Open file (79.86 KB 2400x1350 IMG_20210514_165319.jpg)
This could also fit in the math thread, but it's notation used in ML.
>>6560 Here's a holistic beginner's view of all that you see. The framework people use for machine learning is to build a mathematical function that can make a prediction. That function can be created in many ways, but at the end of the day it's supposed to provide you with a value of some kind that you act on, or output. More recently that function is created using deep learning: a parametric system that learns to capture the data and carve out regions between high-dimensional datapoints, partitioning the data to do classification, or making predictions through regression. Obviously there are many other ways to do this; these are just the high-level constructions. I would suggest you buy Grokking Deep Learning by Andrew Trask; he gives you a really good, deep insight into DL. In practice, however, a lot of the algorithms we use supplement DL techniques: we often use older ML algorithms and feature-engineer through PCA or various other engineering techniques.
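That "parametric function that carves regions between datapoints" can be shown end-to-end in a few lines. A toy two-layer network in NumPy, in the spirit of Grokking Deep Learning (hyperparameters invented for illustration), learning XOR, which no single straight line can separate:

```python
import numpy as np

rng = np.random.default_rng(42)

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)  # XOR labels

# Parameters of the function we are fitting
W1 = rng.normal(size=(2, 8))
b1 = np.zeros(8)
W2 = rng.normal(size=(8, 1))
b2 = np.zeros(1)
lr = 0.1

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

for _ in range(10000):
    # Forward pass: the "mathematical function that makes a prediction"
    h = np.tanh(X @ W1 + b1)
    p = sigmoid(h @ W2 + b2)

    # Backward pass: chain rule, nudging parameters to reduce the error
    dp = p - y                      # gradient of cross-entropy through sigmoid
    dW2 = h.T @ dp
    db2 = dp.sum(axis=0)
    dh = dp @ W2.T * (1 - h ** 2)   # tanh derivative
    dW1 = X.T @ dh
    db1 = dh.sum(axis=0)

    W2 -= lr * dW2; b2 -= lr * db2
    W1 -= lr * dW1; b1 -= lr * db1

print(np.round(p.ravel(), 2))
```

After training, the four predictions should approach 0, 1, 1, 0: the hidden layer has bent the input space so the two classes become separable.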
Open file (60.57 KB 700x355 p_value.png)
Open file (6.89 KB 446x291 variance.png)
10 Must-Know Statistical Concepts for Data Scientists: https://www.kdnuggets.com/2021/04/10-statistical-concepts-data-scientists.html
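Two of the listed concepts, variance and the p-value, can be computed in a few lines of plain Python (toy numbers invented for illustration; the p-value here comes from a permutation test rather than a t-test):

```python
import random
random.seed(0)

def mean(xs):
    return sum(xs) / len(xs)

def variance(xs):
    # Sample variance: average squared distance from the mean,
    # with Bessel's correction (divide by n - 1).
    m = mean(xs)
    return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

a = [2.1, 2.5, 2.2, 2.8, 2.4]
b = [3.1, 3.4, 2.9, 3.6, 3.2]

# Permutation-test p-value: how often does randomly shuffling the
# group labels produce a difference in means at least this large?
observed = abs(mean(a) - mean(b))
pooled = a + b
hits = 0
trials = 10000
for _ in range(trials):
    random.shuffle(pooled)
    diff = abs(mean(pooled[:5]) - mean(pooled[5:]))
    if diff >= observed:
        hits += 1
p_value = hits / trials

print(round(variance(a), 3), p_value)
```

A small p-value here means the gap between the two groups is very unlikely to be a labeling accident, which is all a p-value ever claims.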
>related crosspost (>>10613, >>10614)
>>10607 Thanks very much Anon! Bookmarked.
Open file (120.30 KB metan.jpg)
Hey /robowaifu/, I am back after such a long break! Damn... I created this thread back in 2020... time does not really wait for us, I guess. Well, not much progress has been made on my end. When life faces you with much tougher situations, you can't think of anything else, lol. But I'm back, and I am studying at a university too! I started with studying some maths: algebra, trig, precalculus, stats and calculus. I remember most of the topics from my high school years, but I have not looked at them even once for the past 2 years or so. I also found the Springer book Deep Generative Modeling: https://link.springer.com/book/10.1007/978-3-030-93158-2 The book might look frustrating at first, but you really can simplify the wishy-washy math stuff into basic logic and understand what is going on. The book is also supported by code segments, so it is a lot easier to follow. And there is also this book for those who might be interested: https://artint.info/ It follows a similar way of teaching and provides code for everything, but in my case it is harder to follow than the Springer book. So yeah, pick whichever you would like. Seeing the progress you people have been making, I am kinda disappointed not to have been able to contribute... I will be around from now on, hopefully; in my spare time I will try to contribute to the board... It feels nice to be back. >=== -minor date edit
Edited last time by Chobitsu on 05/26/2022 (Thu) 07:56:22.
>>16409 Welcome back! It's never too late to contribute, Beginner-kun. Looking forward to your contributions. Here's a neat, inexpensive way to get into low-cost AI: the ESP32-S3 has AI acceleration. https://www.cnx-software.com/2021/08/29/esp32-s3-ai-capabilities-esp-dl-library/ Here's a GitHub repo that could help those interested: https://github.com/espressif/esp-dl
>>16409 Welcome back Anon! >Seeing the process you people have been doing, I am kinda disappointed to not be able to contribute... I felt the same way when I first found the board. It's better to just jump in whenever you feel ready.
>>16409 Welcome back, OP! So, if you recall, OP, your posts not only started a nice, important thread, they also led indirectly to another important thread: namely, our Library Thread (>>7210). So thanks! :^) >now at Uni I'll be praying for you Anon, haha. I too am currently undertaking a regime of maths (>>16302), so maybe we can share notes. Thanks for the books; I've looked briefly into the 2nd one so far. >Seeing the process you people have been doing, I am kinda disappointed to not be able to contribute... >I will be around from now on hopefully, in my spare time I will try to contribute to the board.... It feels nice to be back. Glad to have you back! >>16410 Heh, I can clean this up for you if you'd like, Anon. :^) >=== -add 'regime of maths' cmnt
Edited last time by Chobitsu on 05/24/2022 (Tue) 06:34:05.
>>16420 Neat, thanks for the info Anon.
Hey Chobitsu! I am glad to see you again; yeah, please go ahead and fix the post. > So thanks! :^) If it wasn't for you and the great contributors of the board, I would not have had a place to post it, so I thank you! And the library thread was really necessary. I wish the board had a better search function as well; I was trying to find some specific posts and it took me a long while to remember which threads they were in. > so maybe we can share notes My university provided me with a platform full of questions. They have about 250 types of questions for precalculus, for instance. The system is automated: it generates an unlimited number of questions of each specific type, explains the solution for every one, changes the variables randomly, and gives you space to solve as many as you want. I believe the platform requires money for independent use. Besides that, I just study from Khan Academy, but the book you mentioned caught my interest; I will probably look into it. If I ever find any good books on the matter, I will make sure to share them with you.
>>6560 >But there are a lot of resources there and as a beginner it is pretty confusing to find the correct route to learn ML/DL advanced enough to be able to contribute to the robowaifu project.
I can give some relatively uncommon advice here: DL is more about intuition + engineering than theory anyway. Just hack on things until they work, and feel good about it. Understanding will come later. Install PyTorch and play with the tensor API; go through their basic tutorial https://pytorch.org/tutorials/beginner/nn_tutorial.html while hacking on it and trying to understand as much as possible. Develop a hoarder mentality: filter r/MachineLearning and github.com/trending/python for cool repos and models; try to clone & run them, fix them and build around them. This should be a self-reinforcing activity, so you should not have other dopaminergic timesinks, because otherwise you will take the path of least resistance. Read cool people's repos to get a feeling for the trade:
https://github.com/karpathy
https://github.com/lucidrains
Read blogs:
http://karpathy.github.io/
https://lilianweng.github.io/posts/2018-06-24-attention/
https://evjang.com/2021/10/23/generalization.html
https://www.gwern.net/Scaling-hypothesis
https://twitter.com/ak92501
When you feel more confident, you can start delving into papers on your own; use https://arxiv-sanity-lite.com/ https://www.semanticscholar.org/ https://inciteful.xyz/ to travel across the citation graph. git gud.
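The linked tutorial boils down to a loop whose shape is worth internalizing: forward pass, loss, backward pass, optimizer step. A minimal sketch (toy regression data invented for illustration; requires PyTorch installed):

```python
import torch

torch.manual_seed(0)

# Toy data: 64 samples of a known linear relationship plus noise
X = torch.randn(64, 3)
true_w = torch.tensor([[1.5], [-2.0], [0.5]])
y = X @ true_w + 0.1 * torch.randn(64, 1)

# model -> loss -> optimizer, then the loop
model = torch.nn.Linear(3, 1)
opt = torch.optim.SGD(model.parameters(), lr=0.1)
loss_fn = torch.nn.MSELoss()

for _ in range(500):
    opt.zero_grad()            # clear old gradients
    loss = loss_fn(model(X), y)  # forward pass + loss
    loss.backward()            # backward pass (autograd)
    opt.step()                 # nudge the parameters

print(loss.item())  # small: close to the noise floor
```

Swap the `Linear` for any model and the toy data for a real dataset, and that same five-line loop is most of practical DL.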
>>16460 >If it wasn't for you and the great contributors of the board, I would not have a place to post that so I thank you! Glad to be of assistance Anon. >And the library thread was really necessary, I wish that the board had a better search function as well. Agreed. >I was trying to find some specific posts and it took me a long while to remember which threads they were on. You know Beginner-kun, if you can build programs from source, then you might look into Waifusearch. We put it together to deal with this conundrum. Doesn't do anything complex yet (Boolean OR), but it's fairly quick at finding related posts for a simple term. For example to lookup 'the Pile', pic related is the result for /robowaifu/ : > >-Latest version of Waifusearch v0.2a >(>>8678) >My University provided me with a platform full of questions >It changes the variables randomly and gives you space to solve as much as you want. Sounds handy! >Besides that, I just study from Khan Academy, but the book you mentioned caught my interest. I will probably look into it. If I ever find any good books on that matter, I will make sure to share them with you. Thanks! Looking forward to your progress Anon. I chose Euler b/c Anon recommended him, so I scanned information about the man. He was a real genius, and the story behind that book's creation is incredible tbh. Seems like a good fit for me personally to start out with, time will tell heh. :^)
>>16464 Thanks for all the great links & advice Anon, appreciated.
>Chobitsu Thanks for the great board, and your patience... I had an idea that it may be fruitful to download some of these training datasets. I have seen some around, like a full Reddit, 4chan, stuff like that. The reason is that I saw the other day that one of these big sets, I think Reddit, now needs to be paid for. So I think you can still get this stuff, but maybe not forever. While looking for links I found this: https://academictorrents.com/browse.php This is a huge mass of torrents on academic subjects, and lo and behold, one of the first ones was Udemy - Artificial Intelligence A-Z Learn How To Build An AI https://academictorrents.com/details/750ab85e01a0d443bd0d19e49b250f8896e1e791 Now, I don't know enough to judge, but this might be a good way to get a toehold on AI stuff. I thought I would mention it so the link "might" be helpful to someone. I'm downloading it.
Here's Reddit: Reddit comments/submissions 2005-06 to 2022-12, 1.99TB. I need a new hard drive for this. https://academictorrents.com/details/7c0645c94321311bb05bd879ddee4d0eba08aaee I wish there was a set of 1950s home-economics course books. That would be ideal for a waifu. They used to train girls to treat Men well. Have you seen any of the home-econ videos from the fifties? Girls just laugh and laugh at these, but... it would be so nice, so pleasant, to be treated like they show. What the hell, I decided to add a few links to these. I don't know how you could train the waifu on these videos, but if you could it would be great. Sexist PSAs From The '40s and '50s Show How Far Women Have Come _ NowThis https://www.youtube.com/watch?v=_-OAIAhBiHc "Tips for a Happy Marriage" old 1950s film https://www.youtube.com/watch?v=32hRC1li-T0 How They Raised Girls To Be Women In The 1950s https://www.youtube.com/watch?v=HW5FMRJesC8 Good Wife's Guide Training Video (a spoof, but lots of real videos were just like this and used to train young girls) https://www.youtube.com/watch?v=DB5TOsS5EyI The next one of course has been wiped: https://web.archive.org/web/20170430093822/http://goodwifesguide.co.uk:80/tag/wife-training/ If we could save a bunch of this stuff and use it for training it might be invaluable.
>>23129 Nprb Anon. We're all in this together. :^) >>23131 >I wish there was a set of 1950's home economics course books. That would be idea for a waifu. Great idea Anon! I sure hope you can locate mountains of this stuff. Let's see what we can manage with it if you do. Thanks for all the helpful links Grommet. Cheers.
I don't know anything about this, but it was recommended by a programmer with 40 years' experience. I read his forum and he talks about the various stuff he's doing; he writes all sorts of software for various companies. He is also using AI to write software and port older code he has, and it's working well for him. He suggested the libraries they have are useful. tinyML Foundation: the community for ultra-low-power machine learning at the edge https://www.tinyml.org/ I have no idea of their capabilities, but it sure sounds like something that could be helpful. They are doing AI stuff with microcontrollers, so they say. He also suggested this video https://www.youtube.com/watch?v=tL1zltXuHO8
>>23409 Thanks Grommet! Here's a platform-specific tinyML repo, opensauce: https://github.com/Efinix-Inc/tinyml And a general search for it on SJWH*b: https://github.com/topics/tinyml Harvard even has an open course on the topic: pll.harvard.edu/course/fundamentals-tinyml Cheers. :^)
>>23414 Another, maybe different, one: https://www.tensorflow.org/lite I think it's the same or possibly an offshoot; it does get mentioned in the link you provided.
The AI Engine That Fits In 100K https://hackaday.com/2023/08/02/the-ai-engine-that-fits-in-100k/ Of course they're hyping this a bit, but... it could be useful. I was reading the other day that something like 99% of the neuron capacity is wasted in present algorithms. I read here >>3031 that "...it now takes 44 times less compute to train a neural network... algorithmic progress has yielded more gains than classical hardware efficiency..." and "...By 2028 we will have algorithms 4000x more efficient than AlexNet." So maybe it's not all hype; maybe there is some big increase in efficiency coming. Maybe even enough that we could get some sort of interactive waifu, at a child's level, on commodity processors. Abstract: https://research.nvidia.com/labs/par/Perfusion/ Paper: https://arxiv.org/abs/2305.01644
This talk shows some ways to get into the field and what would be good to work on: https://youtu.be/5Sze3kHAZqE - He recommends fine-tuning and transfer learning, and is a bit skeptical about RAG. He mentions cerebras/btlm-3b as an underappreciated model around the 01:06:00 timestamp. Pretty sure it has been recommended to me through this board or the Discord, since I already had the paper. Phi-1.5 is also mentioned later, which seems to work for Python snippets but not much more.
>Fast.ai's "Practical Deep Learning" courses have been watched by over 6,000,000 people, and the fastai library has over 25,000 stars on GitHub. Jeremy Howard, one of the creators of Fast.ai, is now one of the most prominent and respected voices in the machine learning industry; but that wasn't always the case... Read the full show notes here: https://www.latent.space/p/fastai
0:00:00 Introduction
0:01:14 Jeremy's background
0:02:53 Founding FastMail and Optimal Decisions
0:04:05 Starting Fast.ai with Rachel Thomas
0:05:28 Developing the ULMFit natural language processing model
0:10:11 Jeremy's goal of making AI more accessible
0:14:30 Fine-tuning language models - issues with memorization and catastrophic forgetting
0:18:09 The development of GPT and other language models around the same time as ULMFit
0:20:00 Issues with validation loss metrics when fine-tuning language models
0:22:16 Jeremy's motivation to do valuable work with AI that helps society
0:26:39 Starting fast.ai to spread AI capabilities more widely
0:29:27 Overview of fast.ai - courses, library, research
0:34:20 Using progressive resizing and other techniques to win the DAWNBench competition
0:38:42 Discovering the single-shot memorization phenomenon in language model fine-tuning
0:43:13 Why fine-tuning is simply continued pre-training
0:46:47 Chris Lattner and Modular AI
0:48:38 Issues with incentives and citations limiting innovation in research
0:52:49 Joining AI research communities through Discord servers
0:55:23 Mojo
1:03:08 Most exciting areas - continued focus on transfer learning and small models
1:06:56 Pushing capabilities of small models through transfer learning
1:10:58 Opening up coding through AI to more people
1:13:51 Current state of AI capabilities compared to computer vision in 2013 - lots of basic research needed
1:17:08 Lightning Round
>BTLM-3B-8k-base: Licensed for commercial use (Apache 2.0). State-of-the-art 3B-parameter model. Provides 7B-model performance in a 3B model via performance enhancements from ALiBi, SwiGLU, maximal update parameterization (muP) and the extensively deduplicated and cleaned SlimPajama-627B dataset. Fits in devices with as little as 3GB of memory when quantized to 4-bit. One of few 3B models that supports 8k sequence length, thanks to ALiBi. Requires 71% fewer training FLOPs and has a 58% smaller memory footprint for inference than comparable 7B models. https://huggingface.co/cerebras/btlm-3b-8k-base (search for other variants)
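That "3GB of memory when quantized to 4-bit" figure is easy to sanity-check with back-of-the-envelope arithmetic: parameter count times bits per parameter, ignoring activations and KV-cache overhead (rough numbers only):

```python
# Rough weight-memory footprint of a 3B-parameter model
# at different precisions (weights only, no runtime overhead).
params = 3e9

for bits in (32, 16, 8, 4):
    gib = params * bits / 8 / 2**30  # bits -> bytes -> GiB
    print(f"{bits:>2}-bit: {gib:.2f} GiB")
```

At 4 bits the weights alone come to about 1.4 GiB; runtime overhead is what pushes the practical requirement toward the quoted 3GB.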
