• 81 Posts
  • 54 Comments
Joined 1 year ago
Cake day: July 12th, 2023











  • EDIT: Tried nice -n +19, still lags my other programs.

    Yeah, this is the wrong way of doing things. You should get better results with CPU pinning. Shuffling priorities so that YOUR threads, which constantly touch disk I/O, memory caches, and display I/O, win out is the wrong end of the stick: the build still needs to display compilation progress and warnings and access I/O, so it keeps competing with your interactive programs anyway.

    There’s no way of knowing why your system is so slow without profiling it first. Taking advice from here or anywhere else before telling us what your machine is actually doing misses the point. You need to find out what the problem is and report it at the source.
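
    If you do go the pinning route, here is a rough sketch of the idea in Python (assuming Linux; os.sched_setaffinity is a real Linux-only call, but the core split and the make invocation are just placeholders for whatever your build actually runs):

```python
import os
import subprocess

# Example split only: leave cores 0-3 to the desktop, give the build cores 4-7.
BUILD_CORES = {4, 5, 6, 7}

def pin_to_build_cores():
    # Runs in the child just before exec; the affinity mask is inherited by
    # the build and everything it spawns (Linux-only API).
    os.sched_setaffinity(0, BUILD_CORES)

# Placeholder build command; substitute your real one.
build = subprocess.Popen(
    ["make", "-j", str(len(BUILD_CORES))],
    preexec_fn=pin_to_build_cores,
)
build.wait()
```

    You can get the same effect from the shell with something like taskset -c 4-7 make -j4; the point is just that the build gets its own cores instead of a lower priority.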
















  • Just not in Java…

    I think you’re biased against Java. Amazon was built on C/C++ and Java J2EE back when configuring a web server meant writing something like 300 lines of XML just to handle cookies, browser caching, and a login page. Until recently BMW had its own JRE implementation. It’s no secret that SIM cards, including the ones in Tesla cars, run Java Card too; even government-issued SIM cards in the EU have to run Java Card, not C++. Everything was fine with Java until ECMAScript appeared and got people iterating on software versions faster. New programming languages and new team-organisation methodologies left some languages in the dark, and that included C# too, but all of them are quickly catching up. If Java were really that bad, it wouldn’t still be here with us today; it would have faded away like Perl.






  • So while I’m still struggling to fully understand what this is, conceptually it sounds like a blockchain on top of Syncthing, where even if you subscribe to a read-only share, you can locally delete what you don’t want to keep. So technically you could make BitTorrent behave like Syncthing, with a search function covering the contacts you already know.




  • Big O notation is useless for smaller sets of data. Sometimes it’s worse than useless, it’s misguiding.

    I don’t agree that it’s useless or misleading. The smaller the dataset, the less it matters, but it still makes a massive difference to how the rest of the code behaves, and the context around it can change.

    Let’s say you need to sort 64 ints in code that starts your operating system. You sort them once per boot, and you boot less than once per day; in fact you know of instances of the OS with 14 years of uptime, so it doesn’t matter at all, right? Welp. Now your OS is used by a big cloud provider and they run that code to boot the kernel 13 billion times per day. The context changed, time passed, and your silly bubble sort that “doesn’t matter on small numbers” is still there.
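
    To make that concrete, here is a rough sketch in Python (the exact numbers will vary wildly by machine and language; the only point is that the “tiny” O(n²) loop has a measurable cost, and that cost gets multiplied by however many times the code actually runs):

```python
import random
import timeit

def bubble_sort(values):
    # Textbook O(n^2) bubble sort: the "it's only 64 ints" version.
    values = list(values)
    n = len(values)
    for i in range(n):
        for j in range(n - 1 - i):
            if values[j] > values[j + 1]:
                values[j], values[j + 1] = values[j + 1], values[j]
    return values

data = [random.randrange(1 << 31) for _ in range(64)]
runs = 10_000

print("bubble sort:", timeit.timeit(lambda: bubble_sort(data), number=runs))
print("built-in   :", timeit.timeit(lambda: sorted(data), number=runs))
```

    Per call the difference is microseconds, which is exactly why it survives review; multiplied by billions of boots a day, it adds up.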