I know a Google engineer who was saying they’re having to update their code bases to handle > 16 exabytes of storage, if you can imagine. But yeah, that’s storage, not RAM.
I started in C and switched to C++. It’s easy to think that the latter sort of picked up where the former left off, and that since the advent of C++11, it’s unfathomably further ahead. But C continues to develop and occasionally gets a new feature of its own. One example I can think of is the restrict keyword, which allows for certain optimizations. AFAIK it’s not included in the C++ standard to date, though most compilers support it in some non-standard way because of its usefulness. (With Rust, the language design itself obviates the need for such a keyword, which is pretty cool.)
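Roughly what that buys you, as a sketch (the function and names here are made up purely for illustration):

```c
#include <stddef.h>

/* Sketch only: by promising the compiler that dst and src never alias,
 * restrict lets it vectorize and reorder the loads and stores freely.
 * In C++ you'd typically reach for the non-standard __restrict instead. */
void scale_add(size_t n, float *restrict dst, const float *restrict src, float k)
{
    for (size_t i = 0; i < n; ++i)
        dst[i] += k * src[i];
}
```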
Another feature added to C was designated initializers, which let you initialize a struct with something like FooBar fb = {.foo=1, .bar=2};. I’ve seen modern C code that uses structs this way to give you something close to keyword args like in Python. As of C++20, they sort of added this too, but with the restriction that the named fields have to come in the same order they were originally declared in the struct, which is a bit annoying.
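For illustration, this is roughly what I mean by the keyword-args idiom (the struct and function names are invented for the example):

```c
#include <stdio.h>

/* Sketch of the "keyword argument" idiom via C99 designated initializers.
 * Field order doesn't matter and omitted fields are zeroed, which is what
 * makes it feel like Python kwargs. C++20 accepts the syntax too, but the
 * designators must appear in declaration order. */
typedef struct {
    int width;
    int height;
    int border;
} BoxOpts;

static void draw_box(BoxOpts opts)
{
    printf("w=%d h=%d border=%d\n", opts.width, opts.height, opts.border);
}

int main(void)
{
    /* out-of-order designators: fine in C, rejected by C++20 */
    draw_box((BoxOpts){ .height = 2, .width = 10 });  /* border defaults to 0 */
    return 0;
}
```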
Overall though, C++ is way ahead of C in almost every respect.
If you want to see something really trippy, though, have a look at all the crazy stuff that’s happened to FORTRAN. Yes, it’s still around and had a major revision in 2018.
I guess the MAC address guy is up next. 48 bits may not go so far if every light bulb is going to want its own.
Imagine if you were the guy who made the call on IPv4 addresses…
Falsehoods About Time
Having a background in astronomy, I knew going into programming that time would be an absolute bitch.
Most recently, I thought I could write a script to project when Easter would land each year so it could be marked on office timesheets. After spending an embarrassing amount of…er…time on it, I gave up and downloaded a table of pre-calculated dates. I suppose at some point, assuming the code survives that long, it will have a Y2K-style moment, but I didn’t trust my own algorithm over the table. I do think it is healthy, if not essential, to not trust your own code.
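For reference, the standard published formula is the anonymous Gregorian computus (the Meeus/Jones/Butcher algorithm). A sketch of it in C looks something like this, though I’m not claiming my abandoned script was anywhere near this tidy:

```c
#include <stdio.h>

/* Anonymous Gregorian algorithm (Meeus/Jones/Butcher) for the date of
 * Easter. Valid for the Gregorian calendar; all divisions are integer. */
static void easter(int year, int *month, int *day)
{
    int a = year % 19;
    int b = year / 100;
    int c = year % 100;
    int d = b / 4;
    int e = b % 4;
    int f = (b + 8) / 25;
    int g = (b - f + 1) / 3;
    int h = (19 * a + b - d - g + 15) % 30;
    int i = c / 4;
    int k = c % 4;
    int l = (32 + 2 * e + 2 * i - h - k) % 7;
    int m = (a + 11 * h + 22 * l) / 451;
    *month = (h + l - 7 * m + 114) / 31;        /* 3 = March, 4 = April */
    *day   = ((h + l - 7 * m + 114) % 31) + 1;
}

int main(void)
{
    int month, day;
    easter(2024, &month, &day);
    printf("Easter 2024: %d/%d\n", month, day);  /* prints 3/31 */
    return 0;
}
```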
Falsehoods About Text
I’d like to add “Splitting at a code-point boundary is safe” to your list. Man, was I ever naive!
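To make the trap concrete (sketch only, assuming UTF-8): take “é” stored as a base letter plus a combining accent and split it between the two code points. Both halves are valid UTF-8, but the character the user sees has been torn apart.

```c
#include <stdio.h>

/* Splitting at a code-point boundary can still tear a grapheme apart.
 * Here "é" is stored as 'e' + U+0301 (combining acute accent). */
int main(void)
{
    const char s[] = "e\xCC\x81";              /* base letter + combining accent */
    char left[2]   = { s[0], '\0' };           /* split after the first code point */
    char right[3]  = { s[1], s[2], '\0' };
    printf("left:  \"%s\"\n", left);           /* "e"                */
    printf("right: \"%s\"\n", right);          /* an orphaned accent */
    return 0;
}
```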
So the next captcha will be a list of AI-generated statements and you have to decide which are bat shit crazy?
“Recall uses Copilot+ PC advanced processing capabilities to take images of your active screen every few seconds,”
Seems like a lot of extra disk thrashing that would shorten the life expectancy of an SSD? Like it would be considerably more than your usual background chatter of daemons writing to log files and what not. Unless I’m misunderstanding this?
We need to watermark insert something into our watermark posts that watermark can be traced back to its origin watermark if the AI starts training watermark on it.
I’m with you on this one. There are lyrics on almost every single track, for crying out loud. Throw us instrumental lovers a bone, won’t you? Songs that are lyrically driven but otherwise super-repetitive instrumentally tend to put me to sleep.
What I love about concerts is when the band goes off script and just starts jamming. Even a 5-minute drum solo will have me grinning ear to ear, and that’s what I’ll be remembering on the way home.
I think I could get very nervous coding for the military, depending on what sort of application I was working on. If it were some sort of administrative database, that doesn’t sound so bad. If it were a missile guidance system, oh man! A single bug and there goes a village full of civilians. Even something without direct human casualties could be nerve-wracking. Like if it were your code that bricked a billion-dollar military satellite.
Speaking of missile guidance systems, I once met someone who worked a stint for a military contractor. He told me a story about a junior dev who discovered an egregious memory leak in a cruise missile’s software. The senior dev told him, “Yeah, I know about that one. But the leak takes an hour to bring the system down, and the missile’s maximum flight time is less than that, so no problem!” I think coding like that would just drive me into some OCD hell.
I have only written potentially life-threatening code once in my life. It had to do with voltage/current regulation in the firmware of a high-powered instrument used by field workers at the company where I work. It was a white-knuckled week spent on just a single page of code, checking and re-checking it countless times and unit testing it in every way I could imagine.
The city where I live has a musical instrument lending library. I don’t know how common these are? Ours started when a cherished local musician passed away and his eclectic collection became the library. Over the years, more people have donated instruments and there is an annual festival to raise funds for their upkeep. (As a local musician, I’m actually playing at said festival today.)
Anyway, it works just like a regular library. You get your library card and check out an instrument and it doesn’t cost you a penny. And there are all kinds of videos online these days to give you pointers on how to play. I guess if you get really serious, you’ll probably want some one-on-one tutoring, but if you’re just doing it for kicks and don’t have any plans to join a band or whatever, you can just have some fun and see how far you can get on your own?
Fair enough. I’m just looking for some independent confirmation as this is pretty big news.
Is this official though, or wishful thinking on the part of Cameron?
I don’t live in Scotland, but I can’t even imagine what it must’ve been like to have that close referendum followed by Brexit only a couple of years later.
What I’m wondering about right now though is Irish unification? That seems to be building up some serious momentum from everything I’ve been reading.
I don’t really have an answer for you, but I can say that when recompiling older codebases (some in C and some in C++) with a modern C++ compiler, type errors are among the most common I have to address. In particular, compilers seem to insist more on explicit casts for type narrowing, which is a good thing. But I don’t know about modern C itself? It wouldn’t surprise me if the language has become stricter.
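A trivial sketch of the sort of thing I mean (made-up names; the complaint typically shows up with -Wconversion in C, or as a hard error with C++ brace-initialization):

```c
#include <stdint.h>

/* Narrowing conversions that old code did silently and newer toolchains
 * flag. The explicit cast is the fix I end up writing over and over. */
void store_reading(int32_t raw, int16_t *out)
{
    *out = raw;           /* implicit int32 -> int16: warned about today      */
    *out = (int16_t)raw;  /* explicit cast: acknowledges the truncation        */
}
```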
This was a struggle for me going from hobbyist programmer to working at a company. I tried to tone it down. Really. But eventually I got “promoted” to having my own office with a suspiciously thick door. Hmm…
True story. I was looking for an answer to an obscure problem and found it in a 10-year-old stackoverflow post. Then I looked more closely at the author…
Hey! Me from 10 years ago, stop being such a smart ass! It’s obnoxious.
1st reaction: lmao
2nd reaction: hey wait, this is pure genius!
You can always combine integer operations on smaller chunks to simulate a type that’s too big to fit in a register. Python even does this transparently for you, so your integers can be as big as you want.
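A minimal sketch of the idea with 64-bit limbs (arbitrary-precision libraries, and Python’s int under the hood, just scale this same trick up to as many limbs as needed):

```c
#include <inttypes.h>
#include <stdint.h>
#include <stdio.h>

/* 128-bit addition using only 64-bit registers, propagating the carry
 * by hand from the low limb into the high limb. */
typedef struct { uint64_t lo, hi; } u128;

static u128 add128(u128 a, u128 b)
{
    u128 r;
    r.lo = a.lo + b.lo;
    r.hi = a.hi + b.hi + (r.lo < a.lo);   /* carry out of the low limb */
    return r;
}

int main(void)
{
    u128 a = { UINT64_MAX, 0 };           /* 2^64 - 1 */
    u128 b = { 1, 0 };
    u128 s = add128(a, b);                /* 2^64: lo wraps to 0, hi becomes 1 */
    printf("hi=%" PRIu64 " lo=%" PRIu64 "\n", s.hi, s.lo);
    return 0;
}
```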
The fundamental problem that drove the move to 64 bits was needing to address more than 4 GB of RAM. It’s kind of similar to the problem with the Internet, where 4 billion unique IP addresses falls rather short of what we need. IPv6 has a host of improvements, but the massively expanded address space is what gets talked about the most, since that’s what is desperately needed.
Going back to RAM though, it’s sort of interesting that at the lowest levels, memory is accessed in chunks larger than 8 bits, and that’s been the case for a long time now. CPUs have to preserve the illusion that an 8-bit byte is the smallest addressable unit of memory, since software would break badly were this not the case. But it’s somewhat amusing to me that we still shouldn’t really need more than 32 bits to address RAM at the lowest levels, even with the 16 GB I have in my laptop right now. I’ve worked with 32-bit microcontrollers where the byte size is > 8 bits, and yeah, you can have plenty of addressable memory in there if you want.
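A quick back-of-the-envelope check on that (just a sketch): 16 GB addressed byte-by-byte needs 34 bits, but addressed in 8-byte words, which is closer to what the memory bus actually moves, it fits comfortably inside 32 bits.

```c
#include <stdint.h>
#include <stdio.h>

/* How many address bits 16 GB needs at different addressing granularities. */
int main(void)
{
    uint64_t ram = 16ULL << 30;                     /* 16 GB in bytes */
    uint64_t units[] = { 1, 8, 64 };                /* byte, word, cache line */
    for (int i = 0; i < 3; ++i) {
        uint64_t count = ram / units[i];
        int bits = 0;
        while ((1ULL << bits) < count) bits++;      /* ceil(log2(count)) */
        printf("%2llu-byte units: %llu addresses -> %d bits\n",
               (unsigned long long)units[i],
               (unsigned long long)count, bits);
    }
    return 0;
}
```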