I can’t tell if this is sarcasm or not.
Just in case it isn’t - it’s obviously a reference to the late DMX’s legendary song of the same title.
SUVs and crossovers are the most popular cars in Europe.
Here’s a French source, coincidentally: https://www.lemonde.fr/en/economy/article/2023/09/14/suvs-now-dominate-european-car-market_6135326_19.html
Sadly, it seems Meta has made some good decisions in the LLM area at least. Llama is the de facto base of most good 7B and 13B LLMs, the ones that have the potential to run on low-grade hardware.
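For a rough sense of what "low-grade hardware" means in practice, here's a minimal sketch of running a quantized 7B Llama-family model on a plain CPU via llama-cpp-python; the model file path and parameter values are assumptions, not a specific recommendation.

```python
# Minimal sketch: run a quantized 7B Llama-family model on CPU with llama-cpp-python.
# The GGUF path is a placeholder (assumption); any 4-bit quantized 7B model works similarly.
from llama_cpp import Llama

llm = Llama(
    model_path="./models/llama-7b.Q4_K_M.gguf",  # hypothetical local file
    n_ctx=2048,    # context window
    n_threads=4,   # CPU threads; tune to your hardware
)

out = llm("Q: What is the capital of France? A:", max_tokens=32, stop=["\n"])
print(out["choices"][0]["text"].strip())
```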
The conclusion of the research is that the solution is more energy efficient and cheaper. Smart bulbs are nice, but they solve neither of the issues mentioned: they need to be powered on all the time, and you still need the switches either way, unless you design your home to be controlled solely by smartphone, and nobody does that.
I was 7 and I found a booklet for the BASIC programming language that came with my C64. I was fucking ecstatic that I could just write words and make the computer go brrrr. My first project was my school schedule in the form of a table, created with a bunch of print statements. I felt like a fucking wizard.
Sadly, I think you’d get much better performance with proprietary drivers either way, especially if the focus is generative AI.
I do self-host some services, but it bugs me that a lot of articles that talk about costs don’t factor in the additional ones. Drives for a NAS need replacement. Running NUCs means quite an energy draw compared to most ARM-based SBCs.
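To put a rough number on the energy point, here's a back-of-the-envelope sketch of the annual electricity cost difference; the wattages (NUC idling around 15 W, ARM SBC around 5 W) and the 0.30 EUR/kWh price are assumptions, not measurements.

```python
# Back-of-the-envelope electricity cost comparison (all numbers are assumptions):
# a NUC idling around 15 W vs an ARM SBC around 5 W, running 24/7,
# at an assumed 0.30 EUR per kWh.
HOURS_PER_YEAR = 24 * 365
PRICE_PER_KWH = 0.30  # EUR, assumed

def annual_cost(watts: float) -> float:
    """Annual electricity cost in EUR for a device drawing `watts` continuously."""
    return watts / 1000 * HOURS_PER_YEAR * PRICE_PER_KWH

nuc_cost = annual_cost(15)  # ~39 EUR/year
sbc_cost = annual_cost(5)   # ~13 EUR/year
print(f"NUC: {nuc_cost:.0f} EUR/yr, SBC: {sbc_cost:.0f} EUR/yr, "
      f"difference: {nuc_cost - sbc_cost:.0f} EUR/yr")
```

Not a huge amount per year on its own, but it adds up across several always-on machines, and it's exactly the kind of cost those articles tend to leave out.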