  • Having had one of the old Windows phones with a keyboard dumped on me at an old workplace, can confirm it’s completely possible for a phone to have a keyboard and be a complete piece of shit.

    A good phone with a good keyboard may have some use cases. If you do a lot of writing but don’t need any more computing power or screen space than a phone has, plus you want to be doing that on the move, then yeah. For me, I can shitpost on forums using my phone in my spare time, but dealing with on-call work issues - having multiple tabs of Jira and Slack open, for instance - just isn’t really practical on a small screen.

    If your job is very email-centric, then yeah, sure. BlackBerrys were very good for just having the stuff you need - email, VPN, ‘corporate’ office documents - in a form that worked.



  • Writing in ASM is not too bad provided that there’s no operating system getting in the way. If you’re on some old 8-bit microcomputer where you’re free to read directly from the input buffers and write directly to the screen framebuffer, or if you’re doing embedded where it’s all memory-mapped IO anyway, then great. Very easy, makes a lot of sense. For games, that era basically ended with DOS, and VGA-compatible cards that you could just write bits to and have them appear on screen.

    Now, you have to display things on the screen by telling the graphics driver to do it, and so a lot of your assembly is just going to be arranging all of your data according to your platform’s C calling convention and then making syscalls, plus other tedious-but-essential requirements like making sure the stack is aligned whenever you make a call. You might as well write macros to do all that since you’ll be doing it a lot, and once you’ve written the macros you might as well be using C instead, since most of C’s keywords and syntax map very closely to the ASM that those macros would generate.

    A shame - you do learn a lot by having to tell the computer exactly what you want it to do - but I couldn’t recommend it for any non-trivial task any more. Maybe a wee bit of assembly here-and-there when you’ve some very specific data alignment or timing-sensitive requirement.



  • My workplace is strictly a BitBucket shop, and I was interested in expanding my skillset a little and experimenting with different workflows. I was using it as a fancy ‘todo’ list - you can raise tickets in various categories - to remind myself what I wanted to do next in the game I was writing. It’s a bit easier to compare diffs and things in a browser when you’ve been working on several machines in different libraries than it is in the CLI.

    Short answer: a bit of timesaving and some nice-to-haves, but nothing that you can’t do with the command line and SSH. But it’s free, so there’s no downside.


  • Ah, nice. Had been experimenting with using my Raspberry Pi 3B as my home Git server for all my personal projects - easy sync between my laptop and desktop, and another backup for the stuff that I’d been working on.

    Tried running Gitea on it to start with, but it’s a bit too heavy for a device like that. Forgejo runs perfectly, and has almost exactly the same “very GitHub-inspired” interface. Time to run some updates…



  • Most common example would be a bicycle, I think - your pedals tighten in the same direction the wheel turns as you look at them. So your left pedal has a left-hand thread, and goes on and comes off backwards.

    The effect of precession also means that you can put the pedals on finger-tight and a good long ride will make them absolutely solid - you need to bounce up and down on a spanner to loosen them.



  • We’ve found it to be the “least bad option” for DnD. Have a Discord window open for everyone to video chat in, have a browser window open with Owlbear Rodeo or Foundry / Forge for your tokens and character sheets, and it all works smoothly enough. The text chat is sufficient for sending the DM a private message, and for group chat to share art of the things you’ve just run into or organise the next session.

    Completely agree that for anything “less transient”, the UX is beyond awful and trying to find anything historical is a massive PITA.




  • Yeah, it’s always had really strong art direction - still holds up, and you don’t notice missing shadows so much in the middle of a frenetic sequence anyway.

    Good to see ray tracing coming along. You could get the same shadows and lighting in a modern rasterising engine now as demonstrated in the RTX version, but at the cost of much more development time. Graphics like that being available to smaller studios and larger games being feasible for bigger studios would be great. HL2 is massive compared to modern shooters, and not having to spend forever tweaking each scene helps with that.


  • When I was still dual-booting Windows and Linux, I found that “raw disk” mode virtual machines worked wonders. I used VirtualBox, so you’d want a guide somewhat like this: https://superuser.com/questions/495025/use-physical-harddisk-in-virtual-box - other VM solutions are available, which don’t require you to accept an agreement with Oracle.

    Essentially, rather than setting aside a file on disk as your VM’s disk, you can set aside a whole existing disk. That can be a disk that already has Windows installed on it - it doesn’t erase what you have. Then you can start Windows in a VM and let it do its updates - since it can’t see the bootloader from within the VM, it can’t fuck it up. You can run any software that doesn’t have particularly high graphics requirements, too.

    I was also able to just “restart in Windows” if I wanted full performance for a game or something like that, but since Linux has gotten very good indeed at running games, that became less and less necessary until one day I just erased my Windows partition to recover the space.


  • It’s a simple alphabet for computing because most of the early developers of computing worked in it, and therefore it’s supported everywhere. If the Vikings had developed early computers then we could be using the 24 Futhark runes, wouldn’t have upper and lower case to worry about, and wouldn’t need to render curves in fonts because it’s all straight lines.

    But yeah, agreed. Very widely spoken. But don’t translate programming languages automatically; VBA does that for keywords and it’s an utter nightmare.


  • If you move past the ‘brute force’ method of solving into the ‘constraints’ level, it’s fairly easy to check whether there are multiple possible valid solutions. Using a programming language with a good sets implementation (Python!) makes this easy - for each cell, generate a set of all the values that could possibly go there. If there’s only one, fill it in and remove that value from all the sets in the same row/column/block. If there are no cells left that only take a unique value, choose the cell with the fewest possibilities and evaluate all of them, recursively (sketch below). Even a fairly dumb implementation will do the whole problem space in milliseconds. This is a very easy problem to parallelise, too, but it’s hardly worth it for 9x9 sudokus - maybe if you’re generating 16x16 or 25x25 ‘alphabet’ puzzles, but you’ll quickly generate problems beyond the ability of humans to solve.
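
    Something like this minimal Python sketch - the names and the plain list-of-lists grid representation (0 for an empty cell) are my own, not anything from the article:

        def candidates(grid, r, c):
            """Set of values that could legally go in the empty cell (r, c)."""
            used = set()
            for i in range(9):
                used.add(grid[r][i])              # same row
                used.add(grid[i][c])              # same column
            br, bc = 3 * (r // 3), 3 * (c // 3)   # top-left of the 3x3 block
            for i in range(br, br + 3):
                for j in range(bc, bc + 3):
                    used.add(grid[i][j])
            return set(range(1, 10)) - used

        def count_solutions(grid, limit=2):
            """Count completions of grid, giving up once limit are found.
            limit=2 is all you need to test whether a puzzle is unique."""
            best = None
            for r in range(9):
                for c in range(9):
                    if grid[r][c]:
                        continue
                    opts = candidates(grid, r, c)
                    if not opts:
                        return 0                  # contradiction - dead end
                    if best is None or len(opts) < len(best[2]):
                        best = (r, c, opts)
            if best is None:
                return 1                          # no empty cells left - solved
            r, c, opts = best
            # A cell with exactly one candidate gets chosen here automatically,
            # which covers the "fill in forced cells first" step described above.
            total = 0
            for v in opts:
                grid[r][c] = v
                total += count_solutions(grid, limit - total)
                grid[r][c] = 0                    # backtrack
                if total >= limit:
                    break
            return total

    count_solutions always restores the grid it’s given, so you can run it straight on a puzzle: 1 back means a proper unique sudoku, 2 means multiple solutions, 0 means unsolvable.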

  • The method in the article for generating ‘difficult’ puzzles seems mighty inefficient to me - generate a valid solution, and then randomly remove numbers until the puzzle is no longer ‘unique’. That’s a very calculation-heavy way of doing it - you need to re-solve the whole puzzle at every step. It must be the case that a ‘unique’ sudoku has at least 8 distinct digits in the starting puzzle, because otherwise there will be at least two solutions, with the two missing digits swapped over. Preferring to remove numbers equal to values that you’ve already removed ought to get you to a hard puzzle faster - something like the sketch below, maybe?
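
    For what it’s worth, that generation loop might look like this - again just a sketch built on count_solutions from above. I’ve made it keep trying other cells after a failed removal rather than stopping dead, and included the ‘prefer already-removed digits’ tweak:

        import random
        from collections import Counter

        def make_puzzle(solution):
            """Whittle a solved 9x9 grid down to a puzzle that stays unique."""
            grid = [row[:] for row in solution]
            removed = Counter()                   # how many of each digit are gone
            cells = [(r, c) for r in range(9) for c in range(9)]
            random.shuffle(cells)
            while cells:
                # Prefer cells holding a digit that's been removed most already.
                cells.sort(key=lambda rc: removed[solution[rc[0]][rc[1]]])
                r, c = cells.pop()
                digit = grid[r][c]
                grid[r][c] = 0                    # try removing this clue...
                if count_solutions(grid) == 1:
                    removed[digit] += 1           # ...still unique, keep it out
                else:
                    grid[r][c] = digit            # ...not unique any more, put it back
            return grid

    Every removal attempt is a full solver run, which is the calculation-heavy bit; concentrating removals on as few digits as possible is what pushes you towards that eight-distinct-digits limit sooner.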