
AlmightySnoo 🐢🇮🇱🇺🇦

@AlmightySnoo@lemmy.world

Posts
61
Comments
325
Joined
2 yr. ago

Yoko, Shinobu ni, eto... 🤔

עַם יִשְׂרָאֵל חַי (Am Yisrael Chai, "the people of Israel live") Slava Ukraini ("Glory to Ukraine") 🇺🇦 ❤️ 🇮🇱

  • It depends. I'm working in the quant department of a bank, and we work on pricing libraries that the traders then use. Since traders often use Excel and expect add-ins, we have a mostly Windows environment. Our head of CI, a huge Windows and PowerShell fan, decided to add a few Linux (RHEL) servers for automated Valgrind checks and gcc/clang builds, to continuously test our code for warnings, undefined behavior (gcc with -O3 does catch a few of those) and stuff.

    I thought cool, at least Linux is making it into this department. Then I logged into one of those servers.

    The fucker didn't like the default file system hierarchy, created directories like `/Applications` and `/Temp`, and installed programs by manually downloading binaries and extracting them there.

  • Bad track record with their privacy invasion via their Amazon shenanigans (which Richard Stallman called the Ubuntu Spyware), the shilling of Ubuntu One cloud and now Ubuntu Pro subscriptions that are reminiscent of Microsoft's shilling of Microsoft accounts and OneDrive, Snap telemetry...

  • Ubuntu is just Windows in Tux's clothing

  • That's a good way of maximizing technical debt.

  • The case also needs a paint job

  • Noctua gang 🤝

  • Me when someone's Ubuntu install reaches EOL: just install Arch

  • How about just LibreOffice resume templates?

  • That repo is just pure trolling, read the "Improved performance" section and open some source files and you'll understand why.

  • FreeBSD is now obsolete

  • Biased opinion here, as I haven't used GNOME since the switch to version 3 and I dislike it a lot: the animations are so slow that they demand a good GPU with fast VRAM to hide it, and thus GNOME needs to borrow techniques from game/GPU programming to stay fluid for users with less beefy cards.

  • Double and triple buffering are techniques in GPU rendering (double buffering is also used in GPU computing; triple buffering is pointless there when running headless).

    Without them, if you want to do some number crunching on your GPU and have your data on the host ("CPU") memory, then you'd basically transfer a chunk of that data from the host to a buffer on the device (GPU) memory and then run your GPU algorithm on it. There's one big issue here: during the memory transfer, your GPU is idle because you're waiting for the copy to finish, so you're wasting precious GPU compute.

    So GPU programmers came up with a trick to try to reduce or even hide that latency: double buffering. As the name suggests, the idea is to have not one but two buffers of the same size allocated on your GPU; let's call them `buffer_0` and `buffer_1`.

    The idea is that if your algorithm is iterative, and you have a bunch of chunks in host memory on which you want to run the same GPU code, then at the first iteration you take a chunk from host memory, send it to `buffer_0`, and run your GPU kernel asynchronously on that buffer. While it's running, your CPU has control back, so you immediately prepare for the next iteration: you pick another chunk and send it asynchronously to `buffer_1`. When the previous asynchronous kernel run is finished, you rerun the same kernel, again asynchronously, but this time on `buffer_1`. Then you copy, asynchronously again, another chunk from the host to `buffer_0`, and you keep swapping the buffers like this for the rest of the loop.

    Now some GPU programmers don't want to just compute stuff; they might also want to render to the screen. So what happens when they try to copy from one of those buffers to the screen? It depends: if they copy synchronously, we get the initial latency problem back; if they copy asynchronously, the host->GPU copy and/or the GPU kernel will keep overwriting buffers before they finish rendering to the screen, which causes tearing.

    So those programmers pushed the double buffering idea a bit further: just add an additional buffer to hide the latency from sending stuff to the screen, and that gives us triple buffering. You can guess how this one will work because it's exactly the same principle.
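    The double-buffering loop above can be sketched in a few lines. This is a minimal Python illustration, with a worker thread standing in for the asynchronous host->device copies and plain functions standing in for the GPU kernel; all names here (`transfer`, `kernel`, `double_buffered`) are illustrative, not a real GPU API:

    ```python
    from concurrent.futures import ThreadPoolExecutor

    def transfer(chunk):
        # Stand-in for an asynchronous host->device copy.
        return list(chunk)

    def kernel(buf):
        # Stand-in for the GPU kernel: square every element.
        return [x * x for x in buf]

    def double_buffered(chunks):
        """Process each chunk while prefetching the next one into the other buffer."""
        if not chunks:
            return []
        results = []
        with ThreadPoolExecutor(max_workers=1) as copier:
            pending = copier.submit(transfer, chunks[0])  # fill buffer_0
            for i in range(len(chunks)):
                buf = pending.result()  # wait for the copy into this buffer
                if i + 1 < len(chunks):
                    # start copying the next chunk into the other buffer
                    pending = copier.submit(transfer, chunks[i + 1])
                results.append(kernel(buf))  # run the "kernel" on the ready buffer
        return results
    ```

    The copy of chunk i+1 overlaps with the kernel running on chunk i, which is exactly the latency hiding described above; a real implementation would use something like CUDA streams instead of threads.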

  • “these bots”: Yeah, you are being an asshole

    I'm pretty sure he didn't mean his colleagues and is rather talking about the UiPath bots, it's an IT automation tool... 🤖

  • If he's a contractor it's unlikely he'll stay there for too long. I'd bring up the improvements and potential gains (faster processing, ideally no more UiPath license costs) directly with your boss. If they're still not open to that then yeah I'd look elsewhere, because even as an IT automation job it just screams laziness.

  • but some of these processes involve going through Excel files which can take these bots 10s of minutes, which can be done instantly in any scripting language

    The key is being proactive. Have you tried suggesting that to them? Do a small POC with, say, a Python script and show them the difference on one of the Excel files; they're likely to welcome your alternative. They probably have poor data warehousing too, so it could be an opportunity for you to shine and at the same time learn to build that for them from scratch.
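    A POC along those lines can be tiny. Here's a hedged sketch, with the rows hard-coded as stand-ins for what the script would actually load from the Excel file (in practice via a library like openpyxl or pandas); the column names and the task are made up for illustration:

    ```python
    # Hypothetical stand-in for rows read from an Excel sheet;
    # a real POC would load them with openpyxl or pandas instead.
    rows = [
        {"invoice": "A-1", "amount": 120.0, "paid": True},
        {"invoice": "A-2", "amount": 80.0, "paid": False},
        {"invoice": "A-3", "amount": 45.5, "paid": False},
    ]

    def outstanding_total(rows):
        """Sum the unpaid amounts -- the kind of step a bot spends minutes on."""
        return sum(r["amount"] for r in rows if not r["paid"])
    ```

    Timing a script like this against the bot on one real file is usually all the demonstration such a POC needs.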

  • “With the new Desktop Cube, you can switch between workspaces in 3D. Your app windows float off the desktop surface with a parallax effect, so you can see behind them,” said the Zorin OS team. “There’s also the new Spatial Window Switcher, which replaces the standard flat Alt+Tab and Super+Tab dialog with a 3D window switcher.”

    Compiz Fusion is an idea and ideas never die

  • Cats would have already pushed everyone off the edge if the earth was really flat

  • Funny double meaning in Blaze's comment, depending on whether you read it as Social Justice Warrior or Shit Just Works (the instance)

  • I was learning C/C++ back then, and although the nostalgia is strong with this one, Turbo C++ was obviously shit (and Borland quickly killed it later anyway). While looking around for alternatives I found DJGPP, which introduced me to the GNU toolchain, so the jump to Linux to have all of that natively instead of running on DOS was very natural. My very first distro was Red Hat Linux 6.2, which I got as a free CD along with a magazine (I also got a Corel Linux CD the same way, which I was excited about given how their WordPerfect was all the rage back then, but I was never able to install it; I don't remember what the issue was), and it looked like this (screenshot from https://everythinglinux.org/redhat62/index.html ):

  • Memes @lemmy.ml

    Shitty Situation

  • Memes @lemmy.ml

    Reddit mods who decided John Oliver pics were a viable form of protest

  • Lemmy @lemmy.ml

    KaTeX or MathJax support?

  • Lemmy @lemmy.ml

    Please make captchas a default in 0.18

  • 196 @lemmy.blahaj.zone

    handsome rule

  • Fediverse @lemmy.ml

    Trackers like Lemmyverse.net and Fediverse Explorer are incentivizing instance owners to inflate numbers with bots

  • 196 @lemmy.blahaj.zone

    rule - when you open post-blackout r/aww for the first time

  • Lemmy @lemmy.ml

    idle Lemmy tab using too much CPU resources?