That would give politicians another reason to raise the retirement age, in order to stay in power.
I’m not sure I’d trust modern CA to do Med3 justice. The new style of Total War is just a different beast from the sublime RTW/Med2 era.
Lots of little things changed, and it just ‘hits different’. Probably the biggest difference is just that every single fight after the first 20 turns will be a 20 stack vs a 20 stack, and every single battle is life or death for that army. It makes the campaign much faster paced - declare war, wipe stack, capture cities for 3 turns until the AI magics up another 20 stack.
In the original Med2, since there wasn’t automatic replenishment, there were often battles between smaller stacks, even in the late game, as they were sent from the backline to reinforce the large armies on the front. That led to some of my greatest memories: trying to keep some random crossbowmen and cavalry alive against ambushing enemy infantry they wandered into. The need for manual reinforcement led to natural pauses in wars and gave the losing side a chance to regroup without relying on the insane AI bonuses of the modern TW games - and I do mean insane; they’ll have multiple full stacks supplied from a single settlement.
Most OLEDs today ship with logo detection and will dampen the brightness on static elements automatically.
While it isn’t a silver bullet, it does help reduce burn-in, since burn-in is strongly linked to heat, and therefore to pixel brightness. New blue PHOLEDs are expected to further cut burn-in risk. Remember that LCDs also used to have burn-in issues, as did CRTs.
I’ve been using Nvidia under Linux for the last 3 years and it has been a massive pain.
Getting CUDA to work consistently is a feat, and one that must be repeated for most driver updates.
Wayland support is still shoddy.
Hardware acceleration on the web (at least with Firefox) is very inconsistent.
It is very much a second-class experience compared to Windows, and it shouldn’t be.
Linux and Nvidia really need to sort out their shit so I can fully dump Windows.
Luckily the AI hype is good for something in this regard, since running GPUs on Linux servers is suddenly much more important.
One nitpick, Jesus was almost certainly a real figure. There are many records indicating someone with that name was in the area at the time, and that they were executed by crucifixion.
The religious stuff, obviously no way to prove. But as a person, the historical consensus is they existed.
Humans are intelligent animals, but humans are not only intelligent animals. We do not make decisions and choose which beliefs to hold based solely on sober analysis of facts.
That doesn’t change the general point that a model given the vast corpus of human knowledge will prefer the most oft-repeated bits to the true bits, whereas we humans have muddled our way through to some modicum of understanding of the world around us by not doing that.
But the most current information is not necessarily the most correct information.
I could publish 100 papers on Arxiv claiming the Earth is, in fact, a cube - but that doesn’t make it true even though it is more recent than the sphere claims.
Some mechanism must decide what is true and send that information to train the model - that act of deciding is where the actual intelligence in this process lives. Today that decision is made by humans, who curate the datasets used to train the model.
There’s no intelligence in these current models.
Victoria 3 was just boring - I say this as a huge fan of Victoria 2.
I played a few weeks after launch, and for every one of the 4 countries I tried (Russia, Japan, Denmark, Spain), simply building all the things everywhere and ignoring money made everything trivial.
The economic simulation was super barebones, the entire thing could be bootstrapped just by building. An entire population of illiterate farmers would become master architects overnight and send GDP to the double digit billions in a few decades.
A token is not a concept. A token is a word or word fragment that occurred often in free text and was assigned a number. Common words, prefixes, and suffixes make up the vast majority of tokens; the rest are uncommon pairs of letters.
The algorithm that generates tokens is essentially compression; there is no semantic meaning embedded in them.
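For intuition, here is a minimal sketch of the byte-pair-encoding idea behind most tokenizers (toy code, not any production tokenizer): start from characters and repeatedly merge the most frequent adjacent pair. It is purely a frequency/compression step with no notion of meaning.

```python
from collections import Counter

def most_frequent_pair(tokens):
    """Count adjacent token pairs and return the most common one."""
    pairs = Counter(zip(tokens, tokens[1:]))
    return pairs.most_common(1)[0][0]

def merge_pair(tokens, pair):
    """Replace every occurrence of `pair` with a single merged token."""
    merged, i = [], 0
    while i < len(tokens):
        if i + 1 < len(tokens) and (tokens[i], tokens[i + 1]) == pair:
            merged.append(tokens[i] + tokens[i + 1])
            i += 2
        else:
            merged.append(tokens[i])
            i += 1
    return merged

# Start from individual characters and greedily merge frequent pairs.
tokens = list("low lower lowest")
for _ in range(2):
    tokens = merge_pair(tokens, most_frequent_pair(tokens))
print(tokens)  # 'low' becomes a single token after two merges
```

After enough merges on a real corpus, whole common words get single IDs while rare strings stay split into fragments - which is exactly why common words dominate the vocabulary.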
Copilot is GPT under the hood, it just starts with a search step that finds (hopefully) relevant content and then passes that to GPT for summarization.
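The search-then-summarize flow can be sketched like this (all function names here are made up for illustration; the real retrieval step is a proper search index, and `summarize` stands in for the GPT call):

```python
def search(query, corpus):
    """Toy retrieval step: rank documents by word overlap with the query."""
    words = set(query.lower().split())
    return max(corpus, key=lambda doc: len(words & set(doc.lower().split())))

def answer(query, corpus, summarize):
    """Search-then-summarize: retrieve the most relevant document, then hand it to the model."""
    context = search(query, corpus)
    return summarize(f"Summarize this to answer '{query}':\n{context}")

corpus = [
    "Rust ownership rules prevent data races.",
    "Python uses reference counting.",
]
# `summarize` would be the GPT call; here it just echoes the prompt it was given.
result = answer("how does rust prevent data races", corpus, summarize=lambda prompt: prompt)
print(result)
```

The model never searches anything itself; it only sees whatever the retrieval step stuffed into its prompt.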
The Dark Souls 2 DLCs are some of the best content in all of Souls. While the original game has some level design issues, the DLCs are sublime.
With refresh rates like that, you must be talking about LED billboards.
These are different from consumer monitors, which mostly use constant LED backlights and a liquid crystal layer to determine color.
An LED billboard is going to have a fuckton of singular LEDs - each of which can emit exactly one color - arranged in groups to form full pixels capable of displaying many colors. There is no extra LCD layer between your eyes and the billboard LEDs.
The reason for the high refresh rates is that each LED must be extinguished and relit to redraw the image, and the eye is very good at picking up this strobe effect.
The difference vs. a consumer display is that the backlight in a typical monitor is constant. Refreshing the screen involves sending updated instructions to the LCD layer, twisting the crystals and possibly changing the color they allow through.
To make a crude concrete example:
Imagine I am shining a white flashlight in your face. In front of the flashlight I put a colored piece of plastic so the light hitting you is colored. Then I change the plastic to one with a (slightly) different color. I do this 120 times per second. That is a typical consumer display.
Now imagine I am shining a colored flashlight directly in your face. Then I turn it off and grab a flashlight of a different color and shine it in your face. Imagine I do that 120 times per second. That is an LED billboard.
Which do you think is more likely to give you a headache?
One final complication: the brightness of the LEDs is varied over time. They receive a pulse-width-modulated signal rather than a steady voltage, so at lower refresh rates there will be a noticeable ripple across the image, similar to how early CRT screens could look.
Increasing the refresh rate hides a lot of these problems.
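A quick sketch of why pulse-width modulation works at all (toy numbers, not any particular panel): the LED is only ever fully on or fully off, and your eye averages over the period, so perceived brightness equals the duty cycle. The ripple you can notice at low rates is exactly this on/off waveform before the eye has averaged it away.

```python
def pwm_wave(duty, period_samples, n_periods):
    """Square wave: on for `duty` fraction of each period, off for the rest."""
    on = int(duty * period_samples)
    return ([1.0] * on + [0.0] * (period_samples - on)) * n_periods

# LED driven at 25% duty cycle.
wave = pwm_wave(duty=0.25, period_samples=100, n_periods=10)

avg = sum(wave) / len(wave)   # what the eye perceives after averaging
ripple = max(wave) - min(wave)  # what the eye sees if the PWM is too slow
print(avg, ripple)
```

Raising the PWM/refresh frequency doesn’t change the average at all; it just pushes the on/off ripple above what the eye can track.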
Every billion parameters needs about 2 GB of VRAM - if using bfloat16 representation. 16 bits per parameter, 8 bits per byte -> 2 bytes per parameter.
1 billion parameters ~ 2 billion bytes ~ 2 GB.
From the name, this model has 72 billion parameters, so ~144 GB of VRAM.
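The back-of-envelope math above in code form (weights only - activations, KV cache, and framework overhead all add on top of this):

```python
def vram_gb(n_params, bytes_per_param=2):
    """Rough VRAM needed just to hold the weights.
    bytes_per_param: 2 for bfloat16/fp16, 1 for int8, 4 for fp32."""
    return n_params * bytes_per_param / 1e9

print(vram_gb(72e9))     # bfloat16: 144.0 GB
print(vram_gb(72e9, 1))  # int8 quantized: 72.0 GB
```

Quantizing to int8 or int4 halves or quarters the figure, which is why quantized variants of big models fit on far less hardware.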
Check it out only to throw it in the trash. Jared Diamond’s book is thoroughly condemned in anthropological and archaeological circles.
it’s not spontaneous
Spontaneity in thermodynamics refers to a process which occurs without external application of energy. In your description, a pile of ash becoming an apple is spontaneous.
So in a contained universe, it doesn’t matter if it’s an apple releasing energy and becoming a pile of ash, or a pile of ash absorbing energy and becoming a perfectly normal apple.
The net energy is still conserved. Just going from energy to mass unlike mass to energy.
There is no mass-energy conversion in an apple burning to become ash, just the release of chemical energy from newly-formed bonds.
Regardless, conservation of energy is only one part of how the universe operates. The second operating principle is (or at least from hundreds of years of scientific inquiry appears to be) the maximization of entropy. That is the ‘spreading out’ of available energy. This is the reason iron rusts, rather than remaining oxygen and iron - conservation of energy alone cannot explain natural phenomena.
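The two principles combine in the standard spontaneity criterion from thermodynamics (at constant temperature and pressure), which is the textbook way to see why energy conservation alone isn’t enough:

```latex
\Delta G = \Delta H - T\,\Delta S, \qquad \text{spontaneous} \iff \Delta G < 0
```

Iron rusting has $\Delta G < 0$, so it proceeds on its own; ash reassembling into an apple would require an enormous entropy decrease, so $\Delta G \gg 0$ and it doesn’t, even though energy would be conserved either way.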
Spontaneous reconstruction of an ashed apple violates the second law of thermodynamics, and the Second law is no less valid than the First.
Lastly, I was not writing specifically about Penrose’s views on consciousness. His entire theory that gravity is driving the collapse of a wave function, and that said collapse occurs retroactively, is untested and based on an appeal to elegance. This does not make it wrong, but it most certainly should not be taken as true.
Beyond consciousness, the second law of thermodynamics also implies the presence and direction of time. In fact, it is sometimes called the Arrow of Time as it appears to direct physical processes to happen preferentially in the direction that increases entropy.
A self-contained universe with fixed energy and infinite time will eventually see a pile of ash turned into an apple. And it wouldn’t violate a damn thing with our system of physics.
This occurring spontaneously would indeed violate the 2nd law. This is a core disagreement between classical thermodynamics and statistical mechanics, which re-derives classical thermo from probabilistic arguments over system states.
I feel it also warrants stating that Penrose’s theory is not widely accepted, has yet to be tested, and is based mostly on an appeal to elegance - it “seems weird” for there to be uncountably infinite parallel timelines spawning at every instant. It is far too soon for it to be taken as fact.
Rule of thumb is that an employee costs roughly twice their base salary, since the employer still has to cover insurance, payroll taxes, sick time, and other benefits.
That leaves an average salary of 190K for the 50 employees. That isn’t much for tech.
Explaining what happens in a neural net is trivial. All they do is approximate (generally) nonlinear functions with a long series of multiplications and some rectification operations.
That isn’t the hard part, you can track all of the math at each step.
The hard part is stating a simple explanation for the semantic meaning of each operation.
When a human solves a problem, we like to think that it occurs in discrete steps with simple goals: “First I will draw a diagram and put in the known information, then I will write the governing equations, then simplify them for the physics of the problem”, and so on.
Neural nets don’t appear to solve problems that way, each atomic operation does not have that semantic meaning. That is the root of all the reporting about how they are such ‘black boxes’ and researchers ‘don’t understand’ how they work.
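To make the “tracking the math is trivial” point concrete, here is a toy two-layer net in plain Python (the weights are made up for illustration). Every intermediate number is right there to inspect; what nobody can hand you is a sentence explaining what each multiplication *means*.

```python
def relu(v):
    """Rectification: zero out negative activations."""
    return [max(0.0, x) for x in v]

def matvec(W, x):
    """Plain matrix-vector product - every intermediate value is inspectable."""
    return [sum(w * xi for w, xi in zip(row, x)) for row in W]

# A tiny 2-layer net: y = W2 @ relu(W1 @ x). Weights are arbitrary examples.
W1 = [[1.0, -1.0], [0.5, 0.5]]
W2 = [[1.0, 2.0]]
x = [3.0, 1.0]

h = relu(matvec(W1, x))  # hidden activations: [2.0, 2.0]
y = matvec(W2, h)        # output: [6.0]
print(h, y)
```

You can print `h` and `y` at every step of a real network too; the “black box” is the absence of a human-readable story for why those particular numbers, not any inability to compute them.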
There are really only 3 search providers, Google, Bing, and Yandex.
All others pay one of these three for access to their indexes, since creating and maintaining such an index is incredibly expensive.