Fushuan [he/him]

Huh?

  • 1 Post
  • 313 Comments
Joined 1 year ago
Cake day: July 1st, 2023

  • Yeah, Elden Ring SotE. I finished all the content and every boss but one optional one on Saturday, and the last boss yesterday. Good DLC, but the balancing is kinda wacky, and it has the typical complainers about difficulty, the typical defenders who haven’t finished the DLC, and then the people who have done all the bosses and know that the last boss is the most overtuned piece of shit ever crafted in ER. The other hard optional boss (bottom right) is also kinda stupid, but it’s optional and it does give you SOME breathing room; the last one is just completely unenjoyable.

  • Shared pointers are used when multithreading: imagine that you have a process controller that starts and manages several threads, which then run their own processes.

    Some workflows might demand that an object is instantiated from the controller and then shared with one or several processes, or one of the processes might create the object and then send it back via callback, which then might get sent to several other processes.

    If you do this with a raw pointer, you might end up in a race condition over when to free that pointer, and you’ll end up creating some sort of controller or wrapper around the pointer to manage which process is using the object and when it’s time to free it. That’s a shared pointer; they made the wrapper for you. It manages an internal counter for every instance of the pointer: when an instance goes out of scope the counter goes down, and when it reaches zero the object gets deleted.

    A unique pointer is for when, for whatever reason, you want a process to have exclusive access to the object. You might want the guarantee that only a single process is interacting with the object, because it doesn’t handle being manipulated from several processes at once. With a raw pointer you would need to code a wrapper that ensures ownership of the pointer, plus ways to transfer it, so that you know which process has access to it at every moment.

    In the example project I mentioned we used both shared and unique pointers, and that was in my first year working with C++ on the job. What was your job like, that you didn’t see the point of smart pointers after 7 years? All single-threaded programs? Maybe you use some framework that makes the abstractions for you, like Qt?

    I hope these examples and explanations helped you see valid use cases.
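    To make the ownership semantics concrete, here’s a minimal sketch (single-threaded for brevity; `Job` is a made-up payload type standing in for whatever object the processes share):

    ```cpp
    #include <cassert>
    #include <memory>
    #include <utility>

    struct Job { int id = 0; };  // hypothetical payload type

    // shared_ptr: the ref-counted "wrapper" described above.
    bool shared_demo() {
        auto obj = std::make_shared<Job>();
        bool one_owner = (obj.use_count() == 1);
        std::shared_ptr<Job> worker_copy = obj;  // e.g. handed to another thread
        bool two_owners = (obj.use_count() == 2);
        worker_copy.reset();  // that "thread" is done; count drops back to 1
        return one_owner && two_owners && obj.use_count() == 1;
    }

    // unique_ptr: exclusive ownership, transferred explicitly with std::move.
    bool unique_demo() {
        auto owner = std::make_unique<Job>();
        std::unique_ptr<Job> new_owner = std::move(owner);  // ownership handed off
        return owner == nullptr && new_owner != nullptr;    // old owner loses access
    }

    int main() {
        assert(shared_demo());
        assert(unique_demo());
        return 0;
    }
    ```

    The object behind the shared pointer is freed exactly once, when the last copy dies; the unique pointer can never be copied, only moved, so exactly one owner exists at any moment.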


  • It’s not really about the hardware, is it? The option you mentioned won’t enable an alternative app store, and it won’t enable access to Android app emulators (which would be a huge boon for the open-source app offering). The level of trust iPhone users give to Apple is wildly higher than what Android users who tweak their phones give their manufacturers. It is what it is, but don’t delude yourself into thinking it’s about what they do at the kernel level; it’s about the fact that they store tons of sensitive data on their American servers and have an obligation to share that data with the country, and as someone from Europe that doesn’t sit well with me.

  • But… this is not the math you see in STEM, this is the math you see in high school at best. There’s no deeper meaning in actual STEM math problems; they are way too abstract or specific. There are no watermelons, it’s just some a, b, n1, nk… maybe some physics formulas that apply to velocity, mass… I saw zero problems in my uni math and physics courses that used real-world examples.

    I see your point, but that’s for high schoolers, not STEM students or alumni.


  • Oh yeah, I’ve read and heard of plenty of people saying they definitely notice it. I’m lucky enough not to, because most ARPGs don’t run at 60 FPS in intense combat, let alone 120 FPS on an RTX 3080, lmao.

    I was talking more about the jump to 240 Hz and beyond, where I find it surprising that people notice the upgrade in intense gaming encounters, not while calmly checking or testing. I guess there are people who do notice, but again, running games at such high frame rates is very expensive for the GPU and a waste most of the time.

    I’m just kinda butthurt that people feel like screens below 120 Hz are bad, when most games I play hardly run at a smooth 60 FPS, because the market will follow and in a few years we’ll hardly have what I consider normal monitors, and the cards will just eat way more electricity for very small gains.