Stranger things char?
I’m migrating millions of encrypted credit cards from one platform to another (it’s all within the same company, but different teams, different infra, etc.).
I’m the one responsible for decrypting each card, preparing the data in a CSV, and encrypting that CSV for transit. The other guy is responsible for decrypting it and loading it into the importer tool. That guy’s technical lead wanted me to generate the key pair and send him the private key, since that way I wouldn’t have to wait for the guy, and “besides, it’s all in the same company, we’re like a family here”.
Of course I didn’t generate the key pair, and I told them I never wanted to have access to the private key, but wow. That made me lose a lot of respect for that tech lead.
Sennheiser headphones that I bought for about $20 about 10 years ago. The cable is indestructible. I once had to resolder it to the speakers because my cat pulled it out, but the cable itself has endured all kinds of abuse without breaking. And the sound is fantastic.
I think you can also synthesize it from Ipomoea seeds, which contain LSA, so it’s a much shorter path.
I don’t have a story, but I have a setting that’s been really underutilized.
We, as a species, were nomadic for hundreds of thousands of years. We had tens of thousands of years of cohabitation with other hominids, intermingling, making important and powerful discoveries, exploring, etc.
And there’s no one using that for a story?
The only stories about that time that I see are about dumb cavemen or Ice Age migration.
These people had the same mental capacity that we have now. They had culture, rituals, trade, all kinds of interesting things.
So yeah, a story set in prehistoric times.
When I first started learning how to code 9 months ago […]
You do it in teams and call your workmate!
Crypt of the NecroDancer is a rhythm roguelike, which is very unique, imo.
These kinds of forums don’t store the plaintext password; they email it while it’s still in memory and hash it afterwards. Still bad security, but it’s not stored in plaintext.
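To illustrate the store-the-hash approach described above, here’s a minimal sketch using Python’s standard library. The function names and the PBKDF2 parameters are my own assumptions, not anything a specific forum package actually uses; the point is just that only a salted hash ever hits storage, even if the plaintext briefly exists in memory.

```python
import hashlib
import os
import secrets

ITERATIONS = 100_000  # assumed work factor; real deployments tune this

def hash_password(password: str) -> tuple[bytes, bytes]:
    # Derive a salted PBKDF2 hash; the plaintext is never stored.
    salt = os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, ITERATIONS)
    return salt, digest

def verify_password(password: str, salt: bytes, digest: bytes) -> bool:
    # Re-derive with the stored salt and compare in constant time.
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, ITERATIONS)
    return secrets.compare_digest(candidate, digest)

salt, digest = hash_password("hunter2")
print(verify_password("hunter2", salt, digest))  # True
print(verify_password("wrong", salt, digest))    # False
```

Emailing the plaintext before this step is exactly the “still bad security” part: the password transits the network in the clear even though the database only sees the hash.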
I was thinking… What if we do manage to make an AI as intelligent as a human, but we can’t make it better than that? Then the human-intelligence AI won’t be able to make itself better, since it has human intelligence, and humans can’t make it better either.
Another thought: what if making AI better gets exponentially harder each time? Then improvement would become impossible at some point, since there wouldn’t be enough resources on a finite planet.
Or what if it takes super-human intelligence to make human-intelligence AI? Then the singularity would be impossible there, too.
I don’t think we will see the singularity, at least not in our lifetime.