

Zero chance this company replaced him with an AI that actually does anything useful.
I was a pretty experienced programmer when I first read SICP, but I still found it incredibly valuable. I’d recommend it to anyone.
It’s such a good idea. I can’t believe they didn’t think of it sooner.
This is what I came here to say. This is a sovereignty issue they could solve with a minuscule portion of their defense budgets.
I use Orgzly, but that’s mostly because I sync it with Emacs on other devices. I tend to organise things in a tree, but it’s quite flexible.
I just started using finamp a couple of weeks ago and this inspired me to install the beta.
If I find any problems I’ll try to get involved on the repository. Discord is a bit of a turnoff though.
I use gnome-session-inhibit quite a bit, but it’s hard to imagine a good way to automate it. Sometimes I inhibit idle to keep something on screen, and sometimes I just inhibit suspend so something can complete.
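For reference, the two cases look roughly like this (flags per gnome-session-inhibit --help; mpv and the rsync target are just placeholder commands):

```shell
# Inhibit idle so the screen stays on while something is displayed;
# the inhibitor is released when the wrapped command exits:
gnome-session-inhibit --inhibit idle mpv talk.mkv

# Inhibit only suspend so a long-running job can finish unattended:
gnome-session-inhibit --inhibit suspend rsync -a src/ backup:/dst/
```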
It probably doesn’t make sense for the terminal to have anything more than a protocol to control it. The only real benefit to that would be in remote sessions, and it’s not really clear how it should work when multiple machines are involved.
I keep seeing this sentiment, but in order to run the model on a high end consumer GPU, doesn’t it have to be reduced to like 1-2% of the size of the official one?
Edit: I just did a tiny bit of reading and I guess model size is a lot more complicated than I thought. I don’t have a good sense of how much it’s being reduced in quality to run locally.
It sounds like you’d benefit from having a project in mind. I always learned programming languages by building something I wanted, or by tinkering on someone else’s project.
I assume they’re going to use it to waste stupid amounts of electricity?
Why would there be one answer to this? I’d probably use all the available levels depending on the situation, in the same way I’d use --word-diff or -b in git when I need help understanding a complex change.
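To show what I mean, here’s a toy repo (the file and its contents are made up for the demo) with one whitespace-only change and one real word change:

```shell
# Build a throwaway repo with two kinds of change, then compare the views.
set -e
repo=$(mktemp -d)
cd "$repo"
git init -q .
printf 'alpha beta\ngamma delta\n' > notes.txt
git add notes.txt
git -c user.name=demo -c user.email=demo@example.com commit -q -m 'first'
# Line 1: whitespace-only change. Line 2: one word changed.
printf 'alpha  beta\ngamma DELTA\n' > notes.txt

plain=$(git diff)              # both lines show as removed/added
words=$(git diff --word-diff)  # highlights just [-delta-]{+DELTA+}
nows=$(git diff -b)            # -b drops the whitespace-only change
printf '%s\n' "$words"
```

The word-level view is what makes big refactoring diffs readable; -b is what I reach for when someone re-indented a block.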
Have you checked all the ethernet links are actually connected at 1G and not 100M?
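A quick way to check on Linux is to read the negotiated speed out of sysfs (a sketch; ethtool <iface> reports the same thing with more detail in its "Speed:" line):

```shell
# List each interface's negotiated link speed from sysfs (Linux only).
# A reading of 100 means the link came up at Fast Ethernet rather than
# gigabit -- often a bad cable or a bad switch port.
link_speeds() {
  for dev in "$1"/*; do
    # Interfaces that are down fail this read; skip them.
    speed=$(cat "$dev/speed" 2>/dev/null) || continue
    printf '%s: %s Mb/s\n' "$(basename "$dev")" "$speed"
  done
}
link_speeds /sys/class/net
```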
Hi, This is a high priority ticket and the FFmpeg version is currently used in a highly visible product in Microsoft. We have customers experience issues with Caption during Teams Live Event. Please help,
Use -data_field first as decoder option in CLI. Default value was changed from first to auto in latest FFmpeg version. Or modify AVOption of same name in API for this decoder.
Thanks @Elon for the reply, This is the command we are currently using: ffmpeg.exe -f lavfi -i movie=flvdecoder_input223.flv[out+subcc] -y -map 0:1 ./output_p.srt
I will be looking to see any updates in the FFmpeg documentation. Can you please elaborate and provide pointers the right decoding options or the right FF command er can use. Thank you!
ffmpeg.exe -data_field first -f lavfi -i movie=flvdecoder_input223.flv[out+subcc] -y -map 0:1 ./output_p.srt
God that’s fucking brutal. This isn’t even asking them to fix a bug, it’s just basic help-desk shit.
I’m sure Microsoft has some good devs that are a net benefit to the open source projects they use, but this is not one of them.
I feel like node’s async model makes it really easy to cause a bug like this, and really difficult to track it down.
It was left to the OS to catch the leak, because the program was written in such a way that it was able to run a gazillion of these tasks concurrently.
http://freenginx.org/pipermail/nginx/2024-February/000007.html
The most recent “security advisory” was released despite the fact that the particular bug in the experimental HTTP/3 code is expected to be fixed as a normal bug as per the existing security policy, and all the developers, including me, agree on this.
And, while the particular action isn’t exactly very bad, the approach in general is quite problematic.
I read something about this the other day, but I’m having trouble wrapping my head around it.
https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2024-24989
https://my.f5.com/manage/s/article/K000138444
https://mailman.nginx.org/pipermail/nginx-announce/2024/NW6MNW34VZ6HDIHH5YFBIJYZJN7FGNAV.html
This seems to have the best discussion I’ve found:
Why put in extra hours? That’s not high-performance, it’s just doing more than one job, assuming you’re paid for a target number of hours.
I use Emacs+org-mode on pc and orgzly on mobile. Syncthing to sync them.
Huh, I’ve seen .local used for this quite a bit and only just now realised that it’s meant for something else.
I’ve also seen .corp 🤮
I use orgzly for android, with syncthing to synchronise the files.
https://f-droid.org/packages/com.orgzly/
It’s very flexible, but I’m not sure it’s quite what you’re looking for.
That’s crazy.
Since GPUs got into the TFLOP range I often think of this old magazine cover:
https://images.computerhistory.org/revonline/images/500004286-03-01.jpg?w=600