![](https://programming.dev/pictrs/image/7fc56aab-5618-4dfb-abb1-0a35bd0d79df.jpeg)
![](https://lemmy.ml/pictrs/image/2QNz7bkA1V.png)
The problem with assassinating the Russian economy is doing it faster than it commits suicide.
Not the latest, but one of the biggest improvements was the Ultimate Hacking Keyboard. I have programmed it to provide Vim navigation at the keyboard level. The latest was switching to Neovim and setting it up properly.
You are confusing Google and the Internet… they are very different things.
Had to test with Kagi as well: it leads with the official documentation, followed by tutorials and unofficial material. Nothing obviously irrelevant. The only issue with the Kagi results was a few very similar official documentation links (for different PostgreSQL versions) at the top. But still good search results. Not sure why anyone is still using Google when there are quite a few better alternatives available.
This is obvious to people who understand the basics of LLMs. However, people are fooled by how intelligent these LLMs sound, so they mistake them for actually being intelligent. So even if this is an open door, I still think it’s good someone is kicking it in to make it clear that LLMs are not generally intelligent.
That’s why it felt very early to have used it before it was the default; before 2016 feels too early to me… But it was well before Covid, so I’d say around 2017.
I know I have used it since Fedora made it the default in 2016. I think I actually used it a while before that, but I don’t have anything to help me pin down the exact time.
Since I only use Intel’s built-in GPU, everything has worked pretty well. The few times I needed to share my screen, I had to log out and log back in to an X session. However, that was solved a couple of years ago. Now I’m just waiting for Java to get proper Wayland support, so I can fully ditch X for my daily use and take advantage of Wayland’s per-monitor DPI capabilities.
I have been a Vim user for more than 20 years. I tried to quit for a couple of years, but now I have just accepted my fate.
On Linux it makes a huge difference. AMD and Intel have great open source drivers, while Nvidia has binary drivers with a lot of issues.
I’m free to choose any laptop I want for work. This means that, for me, the GPU and other processors are free. It turns out I still avoid Nvidia like the plague. I don’t care if it’s free if the drivers are horrible.
The hostility towards custom ROMs in general is what forced me to root. Initially I used LineageOS without root. However, that got me into issues with various apps due to not passing SafetyNet. So now I use Magisk to hide that I use a custom ROM. They basically forced me to root.
But is the desktop really the most relevant measurement? Wouldn’t it be more relevant to talk about “primary” devices? When I grew up, the desktop was what people used to connect to the Internet and everything that comes with that. Hence, Linux on the desktop seemed relevant. That is still relevant for work and gaming, but for general use people use other devices. So instead of “on the desktop” I think we should talk about “for work”, “for gaming” and “for programming”.
That is WezTerm, which has built-in Nerd Font fallback, and I actually think WezTerm renders it too wide, to fit it better with the other fonts. But the rest of the font is JetBrains Mono.
Yes, that was the first thing that came to mind when I saw the TIL post… which was also why I felt the need to check whether that rant is still valid, or whether modern libraries can handle it.
Yes, they are not very upfront with this requirement. It’s almost like they have understood that people don’t like it, but instead of fixing it they just try to hide it from their marketing material. And that doesn’t feel shady at all…
From their documentation:

> Unlike classic terminals, Warp requires you to sign up and log in to get started with the app.
So, yeah, it might be that people are not very impressed by a terminal that requires a cloud account.
But if you don’t type anything sensitive into your terminal, like passwords and such, then you should be fine…
It starts by presenting itself as a Comedy AI, which implies more than a deep fake.
This was in 1985, on an ABC80, a Swedish computer with a 3 MHz CPU. So in theory the loop would run much faster, but I assume there were many performance losses (a slow BASIC interpreter and things like that), so that for loop got close enough to a second for us to use.
On my first programming lesson, we were taught that a 1 second sleep was `for i = 1 to 1000`.
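A minimal sketch of that delay idiom in Python (not the actual BASIC from the lesson), showing why a busy-wait loop only works as a timer on one specific machine: the wall-clock time of counting to 1000 depends entirely on how fast the hardware and interpreter are.

```python
import time

def busy_wait(n: int) -> None:
    # Emulates the classic BASIC delay idiom: FOR I = 1 TO N : NEXT I
    # It burns CPU cycles instead of asking the OS to sleep.
    i = 0
    while i < n:
        i += 1

# Measure how long 1000 iterations take on a modern machine.
# On the ABC80 this was close to a second; today it is microseconds,
# which is exactly why "count to 1000" stopped working as a sleep.
start = time.perf_counter()
busy_wait(1000)
elapsed = time.perf_counter() - start
print(f"1000 iterations took {elapsed:.6f} s")
```

A real 1-second sleep today would of course be `time.sleep(1)`, which yields the CPU instead of spinning.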
😀, computers were not that fast back then…
No, it is not based on GNOME. It is a full desktop environment written in Rust.