Posts: 6 · Comments: 552 · Joined: 8 mo. ago

  • You also can understand everything in a system, at least some people can. I understand those people are rare and expensive to hire.

    No. No, you seriously can't, not even if you are deploying to one single PC. Your code includes libraries and frameworks. With some studying, you might be able to familiarize yourself to the point where you know every single flow through the frameworks and libraries down to each line that's being executed. Then it goes through the compiler. Compiler building is an art unto itself. Maybe there are a handful of people who understand everything GCC does in the roughly 200MB of its source code. But let's say you are a super crack programmer who can memorize source code with about as many characters as 42x all of the Harry Potter books.

    Now your code gets executed by the OS. If you are on Windows: Sucks to be you, because it's all closed source. All you can manage to understand is the documentation, unless you decompile all of Windows. If you are on Linux you at least have the source code. That's only 300MB of source code, shouldn't be hard to completely understand everything in there and keep it in memory, right? And you aren't running your code directly on the bare Linux kernel, so please memorize everything your DE and other relevant components do.

    But we aren't done yet, we are just through the software part. Hardware is important too, since it might or might not implement everything exactly like in the documentation. So break out your hex editor and reverse-engineer the latest microcode update, to figure out how your CPU translates your x64 instructions to whatever architecture your CPU uses internally. An architecture that, btw, doesn't have any public documentation at all. Might be time to break out the old electron microscope and figure out what the 20 billion transistors are doing on your CPU.

    Now we are done, right? Wrong. The CPU is only one component in your system. Now figure out how all other components work. Did you know that both your GPU and your network interface controller are running full embedded operating systems inside them? None of that is publicly documented or open source, so back to the electron microscope and reading binary code in encrypted update files.

    If you think all this knowledge fits into a single human's brain in a way that this human actually knows what all of these components do in any given circumstance, then I don't really know what to say here.

    It's not a matter of skill. It's just plain impossible. It is likely easier to memorize every book ever written.

    One thing C really lacks is modern libraries to do these things. It’s not a limitation of C itself it’s just that most modern tools are targeted towards other languages. I understand that writing webapps in C isn’t the best idea because you don’t want web stuff running on hardware directly most of the time if you care about security anyways, but it’s really just a trend where the industry moved away from C with all of its frameworks and stuff which has not been good for the users.

    You can write webapps in C using WebAssembly. Nobody does it because it takes much more time and has basically no upsides.
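
    For the record, a minimal sketch of how that would look, assuming the Emscripten toolchain (emcc); the function and file names here are just illustrative:

    ```c
    /* hello_wasm.c - a tiny C "webapp" function exported to WebAssembly.
     * Sketch only; assumes the Emscripten SDK is installed.
     * Build roughly like: emcc hello_wasm.c -o hello.html
     */
    #include <stdio.h>
    #include <emscripten/emscripten.h>

    /* EMSCRIPTEN_KEEPALIVE stops the optimizer from stripping the function,
     * so JavaScript on the page can reach it (roughly as Module._add_numbers). */
    EMSCRIPTEN_KEEPALIVE
    int add_numbers(int a, int b) {
        return a + b;
    }

    int main(void) {
        /* printf output ends up in the browser console / generated page. */
        printf("C code running in the browser via WebAssembly\n");
        return 0;
    }
    ```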

    Windows 98 was really good if you knew how it worked. I never had any issues really with stuff like XP. It always worked, it was always fast, it was always stable. I used XP for probably 10 years and never had any issues with instability and stuff and I was constantly modifying stuff, overclocking, patching drivers, modding bios, doing weird stuff that others didn’t do coming up with my own solutions. It worked really well. It’s modern windows that’s a buggy mess that crashes all the time.

    I would recommend that you revisit these old OSes if you think that. Fire one up in a VM and use it for a few weeks or so. Nostalgia is a hell of a drug. I did run Win98 for a while to emulate games, and believe me, your memory doesn't reflect reality.


    Reading what you are writing about programming, may I ask about your experience? It sounds to me like you dabbled in a bit of hobby coding a while ago, is that right?

    Because your assessments don't really make much sense otherwise.

    To get back to the other point though, to move away from C was a mistake. It’s not that much more complicated than using other languages. Most of the complexity was just in setting up the environment which was admittedly terrible under C. Trying to link libraries and stuff. The actual code itself is not really that much more difficult than say python, but it’s a different paradigm.

    No, the problem was not setting up the environment. The main problem with C is that it doesn't do memory management for you, so you constantly have to deal with buffer overflows, memory leaks, invalid pointer accesses and so on. If you try to write past a buffer in any modern language, either the compiler or the runtime will catch it and throw an error. You cannot write past e.g. the length of an array in Java, Python or any other higher-level language like that. C/C++ will happily let you write straight across the stack or heap, no questions asked. This leaves C programs incredibly vulnerable to these kinds of attacks and to instabilities. That's the main issue with C/C++.
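
    A minimal sketch of that difference, with a made-up buffer and loop purely for illustration:

    ```c
    #include <stdio.h>

    int main(void) {
        int buffer[4] = {0, 0, 0, 0};

        /* Classic out-of-bounds bug: indices 4..9 are past the end of the array.
         * C compiles this without complaint; at runtime it silently corrupts
         * whatever happens to live next to 'buffer' on the stack (undefined
         * behavior), which is exactly the kind of bug attackers exploit. */
        for (int i = 0; i < 10; i++) {
            buffer[i] = 42;
        }

        /* In Java this would throw ArrayIndexOutOfBoundsException, in Python
         * an IndexError; here it "works" until it doesn't. */
        printf("%d\n", buffer[0]);
        return 0;
    }
    ```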

    and it’s not automatic that your code is going to be cross platform unless you use platform agnostic libraries. It’s entirely possible to write multiplatform code in C and most programs could be written in a multiplatform way if users use libraries that target multiplatform development and let users compile them ahead of time.

    C is just as much "inherently multiplatform" as Python: Use pure C/Python without dependencies and your code is perfectly multi-platform. Include platform-specific dependencies and you are tied to a platform that supplies these dependencies. Simple as that. Same thing for every other language that isn't specifically tied to a platform.
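
    To illustrate with a toy example of my own: the first function below is pure ISO C and builds anywhere, the second reaches for a platform-specific API and is instantly tied to that platform:

    ```c
    #include <stdio.h>

    /* Pure ISO C: compiles on Windows, Linux, macOS, a microcontroller... */
    void portable_greeting(void) {
        printf("Hello from plain C\n");
    }

    /* Platform-specific: the moment you need OS services (GUI, sockets, ...)
     * you reach for per-platform headers and conditional compilation. */
    #ifdef _WIN32
    #include <windows.h>
    void message_box(void) {
        MessageBoxA(NULL, "Hello", "Demo", MB_OK);
    }
    #else
    #include <unistd.h>
    void message_box(void) {
        /* No MessageBox outside Windows; fall back to the terminal. */
        write(STDOUT_FILENO, "Hello\n", 6);
    }
    #endif

    int main(void) {
        portable_greeting();
        message_box();
        return 0;
    }
    ```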

    You could even have a standard in CPUs that would run any code to bootstrap a compiler and you could have platform agnostic binaries, which is just something that never happened because there was not really a point to it since so much code was written in lockdown .net and directx.

    That standard exists, it's called LLVM, and there are alternatives to it too. And there are plenty of platform-agnostic binaries and such, but if you want to do platform-specific things (e.g. use a GPU or networking or threads or anything hardware- or OS-dependent) you need to do platform-specific stuff.

    Python is a scripting language. It’s best used to call C libraries or to write very lightweight apps that don’t depend on low level hardware access. Java is like C but worse. JavaScript is like the worst of all worlds, strongly typed, verbose, picky about syntax, slow, interpreted, insecure, bloated, but it is cross platform which was originally probably why it was so popular. That should have just been added to C however. When you have code that runs 10x-10,000 times slower and you have bad programmers who don’t know how to write code that doesn’t destroy the bus, or use 100% of your system resources for no benefit, you end up in this mess we have today, for every app that uses 100% of your memory bandwidth, that halves the speed of the next program. If you have 3 programs running that peg then Emory bus, that means your next program is going to run at 0.25 the speed roughly. This is not how software should be written.

    I don't even know what kind of bus you are talking about. Emory bus is a bus line in Atlanta.

    If you are talking about the PCIe bus, no worries, your python code is not hogging the PCIe bus or any other bus for that matter. It's hard to even reply to this paragraph, since pretty much no single statement in there is based in fact.

    The cool thing about C is you can use it like basic if you really want. With a bit more syntax, but you don’t have to use it with classes. You can just allocate memory on stack and heap and then delete all of it with like one class if you really want to. Everything that’s cool about other languages mostly just already exists in C.

    You cannot use C with classes. That's C++. C doesn't have classes.
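
    The closest C gets is a struct plus free functions that take a pointer to it. A rough sketch, with made-up names:

    ```c
    #include <stdio.h>

    /* C has no classes or methods; the usual substitute is a struct
     * plus plain functions that take a pointer to it as first argument. */
    typedef struct {
        double x;
        double y;
    } Point;

    void point_move(Point *p, double dx, double dy) {
        p->x += dx;
        p->y += dy;
    }

    int main(void) {
        Point p = {1.0, 2.0};
        point_move(&p, 3.0, 4.0);   /* would be p.move(3, 4) in a language with classes */
        printf("%.1f %.1f\n", p.x, p.y);
        return 0;
    }
    ```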

    It’s kind of amazing to see the difference between a Linux smartphone and an android smartphone these days. A Linux smartphone running terrible hardware by today’s standard is just instant. 32 GBs of storage is enough to add everything you want to the operating systems because binaries are like 2 MB. Then that all goes away as soon as you open a web browser. A single website just kills it.

    Hmm, nope. Linux smartphones run fast because they have no apps. Do a factory reset on your Android phone and disable all pre-installed apps. No matter what phone it is, it will run perfectly fast.

    But if you run tons of apps with background processes, it will cost performance.

    Then you sit down on a modern windows machine and everything is slow and buggy as shit. It draws 500w of power on a 2nm process node. It’s a real issue. No amount of computer power will ever overcome interpreted languages because people will always do the minimum possible work to get it to run at an unstable 30 FPS and call it good.

    I use Linux as my main OS, but I have Windows as a dual-boot system for rare cases. My PC draws 5W at idle, on Windows or on Linux. The 500W is what your PSU is rated for, or maybe what the PC can draw under full load with the GPU running at full speed (e.g. if you play a photo-realistic game), not what it uses when it idles or just has a few hundred browser tabs open.

  • Have you heard of the term "Software crisis"?

    We don't really talk all that much about it any more, because it's become so normal, but the software crisis was the point where computers became faster than human programmers. That problem came up in the 1960s.

    Up until then a computer was simple enough that a single human being could actually understand everything that happened under the hood and could write near-optimal code by hand.

    Since then computers doubled in performance and memory every few years, while developers have largely stayed human with human performance. It's impossible for a single human being to understand everything that happens inside a computer.

    That's why, ever since, we have been optimizing for developer time over execution time.

    We have been using higher-level languages, frameworks, middlewares and so on to cut the time it takes to develop stuff.

    I mean, sure, we could develop like in the 90s, the tools are all still there, but neither management nor customers would accept that, for multiple reasons:

    • Everyone wants flashy, pretty, animated things. That takes an ungodly amount of performance and work. Nobody is ok with just simple flat unanimated stuff, let alone text-based tools.
    • Stuff needs to run on all sorts of devices: ARM smartphones, ARM/x86 tablets, ARM/x86 PCs, all supporting various versions of Windows, Mac, Android, iOS and preferably also Linux. But we also need a web version, ideally running on Chrome, Firefox and Safari. You could develop all of these apps natively, but then you'd need roughly 20 apps, all of them developed natively by dedicated experts. Or you develop the app on top of browsers/Electron and have a single app.
    • Stuff needs to work. Win95-level garbage software is not ok any more. If you remember Win95/98 fondly, I urge you to boot it up in a virtual machine some time. That shit is unacceptably buggy. Every time the OS crashes (which happens all the time) you are playing Russian roulette with your partition.
    • Did I mention that everything needs to be free? Nobody wants to pay for software any more. Win95 was $199 ($432 in 2025 money) and Office was $499 ($1061 in 2025 money). Would you pay $1.5k just for Win11 and the current Office?

    So there's more and more to do with less and less time and money.

    We can square that circle by either reducing software quality into nothingness, or by using higher-level developer tools that allow for faster and less error-prone development while utilizing the performance that still grows exponentially.

    What would you choose?


    But ultimately, it's still the customer's choice. You don't have to use VSCode (which runs on Electron). You can still use Kate. You don't have to use Windows or Gnome or macOS. You can use Linux and run something like IceWM on it. You don't have to use the newest MS Office, you can use Office 2013 or LibreOffice.

    For pretty much any Electron app out there, there's a native alternative.

    But it's likely you don't use them. Why is that? Do you actually prefer the flashy, pretty, newer alternative, that looks and feels better?

    And maybe question why it feels so hard to pay €5 for a mobile app, and why you choose the free option over the paid one.

  • Python is a millennial. He's 34, is married and has two kids. But the old guys still think he's 15.

  • In your analogy, only C and C# mention using specific tools, unless you count mushrooms as tools ;)

  • The AND and OR also have slightly different meanings than in "real life". That's what always happens if you use natural language terms in the context of formal languages.

    In English (and many other natural languages), "or" means either XOR or OR, and "and" can also mean OR in some contexts.

    So it just follows that IMPLIES doesn't mean the exact same as "implies".


    To take your example with the ravens: IMPLIES is just about whether the inputs contradict the statement. The statement "raven IMPLIES black" should be translated to "raven is black", not "all ravens are black", since you only ever test the statement against specific inputs.

    So if you now give it "raven" and "black" as inputs, the statement is true. If you give it "dove" and "white" as inputs, it doesn't contradict the statement, so the result is still true. Only if you give it "raven" and "white" as inputs does it contradict the statement.

    Remember: Boolean logic isn't about statements of truth; it's a mathematical operation. Just as you use mathematical functions and operators to "calculate x", you use boolean operations to calculate a result.
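
    If it helps, IMPLIES(a, b) is just !a || b. A tiny sketch with the raven/dove inputs encoded as booleans (my own encoding, obviously):

    ```c
    #include <stdbool.h>
    #include <stdio.h>

    /* Material implication: false only when the premise holds
     * and the conclusion doesn't. */
    static bool implies(bool a, bool b) {
        return !a || b;
    }

    int main(void) {
        /* "raven IMPLIES black", tested against concrete inputs. */
        bool is_raven, is_black;

        is_raven = true;  is_black = true;   /* a black raven */
        printf("%d\n", implies(is_raven, is_black));  /* 1 (true)  */

        is_raven = false; is_black = false;  /* a white dove  */
        printf("%d\n", implies(is_raven, is_black));  /* 1 (true)  */

        is_raven = true;  is_black = false;  /* a white raven */
        printf("%d\n", implies(is_raven, is_black));  /* 0 (false) */

        return 0;
    }
    ```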

  • It comes from boolean logic, not from set theory (which is what it's used for here).

    A implies B means that "If A is true, B is also true", so there's the implication. "If the thing we are talking about is a desk, it is also a piece of furniture".

    The implication has no effect if A is not true: "If the thing we are talking about is not a desk, that doesn't say anything about whether it's furniture or not".

    So the implication is only false if A is true and B is false. In any other case the implication is satisfied and the result is true.


    Set theory just took over all of the boolean logic functions, since they work exactly the same in set theory, so they also took over the naming even if the name "implies" doesn't really make much sense in set theory.

  • Seriously, get help. Detachment like that usually means something is quite severely wrong. Could be depression, could be something worse, but regardless, this is not a healthy state to be in.

  • C is the old carpenter with a leaky memory and heavy undiagnosed autism, who constantly cracks demented jokes like "Missing } at end of file".

    He's so mentally not there, in fact, that if you don't specifically tell him to return to you after finishing the job, he will neither figure out what he's supposed to do nor tell you what went wrong. Instead he will happily jump somewhere else, hallucinate commands from the structure of the walls and start doing whatever the voices tell him to do.

  • Before the arbitrary Windows 11 hardware restrictions, this was exactly what was happening on the Windows side as well. There are still tons of 10-15yo Windows devices around, happily running Win10.

    "Regular" people also only upgrade their PC once the old one breaks or if they really encounter something that doesn't work on the old PC (mostly games if they do play somewhat modern games).

    In fact, Windows used to have really awesome long-term support and extremely long upgrade paths. You can easily run Win10 on a quality high-performance PC from 2008. But with Win11, they just tossed all that down the drain.

  • Just become homeless and carless. Then you won't need either set of keys.

  • I did flash animation. I am a developer (I prefer backend but we all have to do some web). I was an adult during that time.

    This is what you said a bit above.

    This is what you are saying now:

    I’ll tell you what I did say: I said I was an (amateur) flash animator starting in middleschool, and I was a developer (not flash) by the time flash was dying (which IMO is the early to mid 10s).

    These two statements contradict each other. Were you an adult at the time, or in middle school?

    If you say you weren't a flash developer, then why include that statement above, except to give yourself an air of fake authority? That's what you did in the whole rest of the comment above, where you tried to tell me that you are a great big adult, super old, and I am a child.

    If you became an adult in the mid 10s, then you are younger than me, kiddo.

    but for you to say that prior to ~2015 it was easier to make 2D animation using unity or javascript+canvas/SVG than it was to make 2D animation in flash, then that is just crazy. its just ignorant.

    There were plenty of tools to make 2D animation even before 2012. You just know your one single little tool and that's all you know. Congratulations.

    Heck, yes, even HTML+JS was enough for making 2D animations in 2010, because that's what I did back then. It was super easy, and as long as you didn't have to target IE, it was really, really easy even with vanilla JS.

    Also:

    by the time flash was dying (which IMO is the early to mid 10s).

    but for you to say that prior to ~2015 it was easier to make 2D animation using unity or javascript+canvas/SVG than it was to make 2D animation in flash, then that is just crazy. its just ignorant.

    So how much time is between "mid 10s" and 2015?

  • how many of those are animated video. holy willful misinterpretation.

    Definitely much, much more than anything that was ever made in flash. Also, flash animation started because there was nothing else that could fit into a size that was affordable enough to host and small enough to download over low bandwidth internet. Two constraints that don't matter any more.

    this whole thread is lamenting the fall of interactive animation. you cant hide a funny mouse-over easter-eggs in a youtube video, like OP is talking about.

    So now we are back to "needs programming", which you said in your last comment is not what you are talking about. You aren't just moving the goalposts, you are switching them back and forth.

    The file sizes are huge.

    Who cares? It's not the early 2000s any more.

    except that the alternatives that fulfill the wants of OP are way harder to make the equivalent art.

    Again, you seem to think that Unity doesn't exist, same as you claimed above that Unity can only do 3D and you claimed you need to know how to program to use Unity. None of which is true.

    Only because, as you demonstrated in the other thread, you’ve (willfully?) misread what I’ve said. meanwhile you’ve just told people “learn to code or pick up a camera, fuck the art that you actually wanted to make”

    The art that YOU wanted to make. Apart from you and that other dude in this thread I have never actually heard of anyone being sad that Flash is gone.

    And if you really want to make flash stuff, you can use one of the dozens of Flash to HTML5 converters (that you apparently don't know exist) or a Flash on Webassembly implementation, which exist as well.

    You are crying over the loss of something that still exists, and people are not using it because it still sucks.

  • You were the one harping on how you were a great flash "developer".

    That was your argument. Now you are claiming that you weren't a developer. Make up your mind.

    Now you don't even know that Unity totally does 2D as well and that it's easier to use than Flash ever was. So your argument boils down to the fact that you don't know anything about the topic at hand.

    People who are making videos nowadays don't need to program either, they just animate a video and upload it to youtube. No programming needed.

    But it was you who claimed that "being a flash developer" is somehow superior to that.

    In fact, we do see a ton of kids making games e.g. using Unity or other tools. Have you ever heard of Roblox? That whole ecosystem is run by kids making games.

    You are just wildly out of touch and stuck in the past.

    Flash sucked. There's better alternatives today. You are out of touch.

  • You know what, I can give you numbers:

    The table shows how YouTube's video count grew from 0.5 billion in 2015 to 5.1 billion in 2025. The biggest jumps happened in 2023 and 2025, with 800 million new videos each year.

    https://seo.ai/blog/how-many-videos-are-on-youtube

    Die Seite hat nach eigenen Angaben über 1.400.000 registrierte Mitglieder und über 660.000 Einträge (Stand: 7. April 2013).

    "The Site has, according to their own statements, over 1 400 000 registered users and over 660 000 entries (7th of April 2013).

    https://de.wikipedia.org/wiki/Newgrounds

    (Had to resort to German Wikipedia, because their stats site has been down for a long time and wasn't saved in Webarchive.)

    So you see, even at the height of their popularity, they had about 1/1000 of the content of Youtube and compared to now, it's 1/10000. And that's only Youtube, not counting Facebook, Reddit, Tiktok, Instagram and all those other platforms people use to share their content.

    And around 2015, the total number dropped, but didn't have a corresponding increase in non-flash equivalents.

    So yes, there has been a massive, massive increase in non-flash video content, so much of an increase that flash looks like a tiny speck of a niche of internet history.

    In fact, most of the old flash videos have more views on youtube than they ever had in their original forms.

    Eww. that's elitist as fuck. These people aren't software devs. They shouldn't need to learn to code in order to animate a video. For absolute shame. Wow.

    And now you are getting onto something. No need to program when making a video for Youtube.

    Flash was abandoned as fast as possible as soon as newer, easier and better alternatives arrived.

    Those who wanted to code, left for JS. Those who wanted to make videos left for Youtube and the likes. Those who wanted to make games left for Unity and other engines.

    Flash was just outdated, old technology. Nothing else.

    You have been elitist as fuck throughout all your comments in this chain, thinking that you are somehow better than everyone else because you got stuck in some old software and didn't manage to migrate to something better.

    If you aren't a developer, don't claim to be.

  • Tbh, Vector only marginally solves that issue. If it's a filmed video, then it doesn't solve it at all, since it just creates "vector pixels" instead, which don't scale either. So it would only work for artificially created videos, and there it would only work for 2D content, and only 2D content that doesn't use bitmaps in it.

    It's quite a limited subset of the videos one might watch. In fact, I can't remember the last time I watched a 2D purely PC generated video that wasn't a screen recording from some game (which is, almost per definition, also rasterized).

    The other problem there is that vector graphics can be rasterized into however many pixels you want, but the detail from the source material doesn't improve. Yes, the edges around a flat area are smoother, but it's still edges around a flat area.

    Compare the best flash animation you can find with some random demo video on youtube (or if you want to go to the extreme: with the graphics of some hollywood CGI). The infinite scalability of vector graphics won't make the flash animation look better than the raster graphics image.

    The "infinite scalability" of vector graphics are a mostly academic point unless you are e.g. designing a company logo that needs to look sharp both on a tiny stamp and on the side of the corporate headquarter.

  • Ok, tell me: How many people make animated music videos and publish them on Youtube, versus how many people make animated music videos and publish them as Flash videos in 2025?

    How many people did that in 2015 in Youtube vs Flash videos?

    Nobody cares about Flash because it sucks. Even back in 2012 Flash sucked. It was a really bad tech, and by 2015 it was mostly used by people too dumb to learn real programming languages and frameworks.

  • The fact that vector works at resolutions (even if they don't exist yet!) without the author even needing to think about it (let alone re-export) is an advantage.

    That's why I was talking about meaningful advantages. Today, stuff gets exported in 4k and that's it. No need for anything more.

    That nobody uses animated SVG should give you a clue about how many people value vector graphics over rasterization. It has uses (mostly when you expect stuff to get zoomed a lot) but only in quite specific use cases.


    There's a ton of free software that exports to HTML5, including most major game engines. And people use that a lot. In fact, you can make VR games that fully run in a browser.

    Browser games still exist. They run on HTML5 now, not on Flash. Web video still exists. It runs on HTML5 players, not on Flash. Little animations in websites still exist. They run on HTML5/SVG/CSS, not on Flash. Flash really was just replaced by HTML5, because it's plain better on every front.

  • Dude, I've been developing HTML apps since 2008. Early HTML5 browser support was literally my job at that time.

    You seem to have totally ignored the next gen tech at that time and now you can't remember what happened back then.

    And now you are basing your whole argument on "you must be a kid".

    Kiddo, I'm likely pretty much the same age as you.

    You were the one who brought up canvas support. By 2015 you could export full 3D games made in Unity to HTML5. And that was certainly not the first, there were literally dozens of other engines that allowed export to HTML5/WebGL at that time.

    If you are too young to remember, that's not my problem, little child.

    Flash died because people moved to a better, more future-proof stack. And you claiming that little 2D animations in Flash were technically much, much better than full 3D rendering with GPU support is honestly wild.

    (If you want to get offensive because you don't have arguments, fine, I can get offensive too, little child.)

  • A vector video format does exist: animated SVG. It has all the features you claim are missing.

    But nobody uses it because it is much more complicated to do than rasterized video and has no relevant advantages.

    You keep claiming that features don't exist even though every single one of them does exist; they're just not used a lot because they are more complicated and have no relevant benefits.