Interested in Linux, FOSS, data storage systems, unfucking our society and a bit of gaming.

I help maintain Nixpkgs.

https://github.com/Atemu
https://reddit.com/u/Atemu12 (Probably won’t be active much anymore.)

  • 58 Posts
  • 753 Comments
Joined 4 years ago
Cake day: June 25th, 2020




  • It’s a central server (that you could actually self-host publicly if you wanted to) whose purpose it is to facilitate P2P connections between your devices.

    If you were outside your home network and wanted to connect to your server from your laptop, both devices would be connected to the TS server independently. When attempting to send IP packets between the devices, the initiating device (i.e. your laptop) would establish a direct WireGuard tunnel to the receiving device. This process is managed by the individual devices, while the central TS service merely facilitates communication between them for the purpose of establishing this connection.
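The pattern described above can be sketched as a toy model. Every class and name here is invented purely for illustration; this is not Tailscale's actual protocol or API:

```python
# Toy model of the coordination pattern: the central server only stores
# each device's public key and reachable endpoint; the devices themselves
# use that information to open a direct tunnel to each other.

class CoordinationServer:
    def __init__(self):
        self.registry = {}  # device name -> (public key, endpoint)

    def register(self, name, pubkey, endpoint):
        self.registry[name] = (pubkey, endpoint)

    def lookup(self, name):
        # The server never relays traffic; it only hands out peer info.
        return self.registry[name]


class Device:
    def __init__(self, name, pubkey, endpoint, server):
        self.name, self.pubkey, self.endpoint = name, pubkey, endpoint
        self.server = server
        server.register(name, pubkey, endpoint)

    def connect(self, peer_name):
        # The initiating device fetches the peer's details from the server,
        # then would establish the WireGuard tunnel itself, device to device.
        peer_key, peer_endpoint = self.server.lookup(peer_name)
        return f"direct tunnel {self.endpoint} -> {peer_endpoint} (key {peer_key})"


server = CoordinationServer()
laptop = Device("laptop", "key-A", "198.51.100.7:41641", server)
home = Device("home-server", "key-B", "203.0.113.9:41641", server)
print(laptop.connect("home-server"))
```

Note that the tunnel is between the two devices directly; the server drops out of the data path entirely once the connection exists.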




  • Good luck packaging new stuff

    Packaging is generally hard on any distro.

    Compared to a traditional distro, the packaging difficulty distribution is quite skewed with Nix though: packages that follow common conventions are a lot easier to package thanks to the abstractions Nixpkgs has built for those conventions, while some packages are near impossible to package due to the unique constraints Nix (rightfully) enforces.

    good luck creating new options

    Creating options is really simple, actually. Had I known earlier that you could do that, I would have done so when I was starting out.

    Creating good options APIs is an art to be mastered but you don’t need to do that to get something going.

    good luck cross-compiling

    Have you ever tried cross-compiling on a traditional distro? Cross-compiling using Nixpkgs is quite easy in comparison.

    actually good luck understanding how to configure existing packages

    Yeah, no way to do so other than to read the source.

    It’s usually quite understandable without knowing the exact details though; just look at the function arguments.

    Also beats having no option to configure packages at all. Good luck slightly modifying an Arch package. It has no abstractions for this whatsoever; you have to copy and edit the source. Oh and you need to keep it up to date yourself too.

    Gentoo-like standardised flags would be great and are being worked on.

    good luck getting any kind of PR merged without the say-so of a chosen few

    Hi, one of the “chosen few” here: That’s a security feature.

    Not a particularly good one, mind you, but a security feature nonetheless.

    There’s also a merge bot now running in the wild that allows maintainers of packages to merge automatic updates to their maintained packages, which alleviates this a bit.

    have fun understanding why some random package is being installed and/or compiled when you switch to a new configuration.

    It can be mysterious sometimes but once you know the tools (e.g. nix why-depends), you can directly introspect the dependency tree that is core to the concept of Nix and figure out exactly what’s happening.

    I’m not aware of the existence of any such tools in traditional distros though. What do you do on e.g. Arch if your hourly shot of -Syu goes off and fetches some package you’ve never seen before due to an update to some other package? Manually look at PKGBUILDs?


  • The writer will need to tag things down, to minimal details, for the sake of languages that they don’t care about.

    Sure and that’s likely a good bit of work.

    However, you must consider the alternative which is translating the entire text to dozens of languages and doing the same for any update done to said text. I’d assume that to be even more work by at least one order of magnitude.

    Many languages are quite similar to one another. An article written in the hypothetical abstract language and tuned on an abstract level to produce good results in German would likely produce good results in Dutch too and likely wouldn’t need much tweaking for good results in e.g. English. This has the potential to save a ton of work.

    This issue affects languages as a whole, and sometimes in ways that you can’t arbitrate through a fixed writing style because they convey meaning.

    The point of the abstract language would be to convey the meaning without requiring a language-specific writing style. The language-specific writing style to convey the specified meaning would be up to the language-specific “renderers”.

    (For example: if you don’t encode the social gender into the 3rd person pronouns, English breaks.)

    That’s up to the English “renderer” to do. If it decides to use a pronoun for e.g. a subject that identifies as male, it’d use “he”. All the abstract language’s “sentence” would contain is the concept of a male-identifying subject. (It probably shouldn’t even encode the fact that a pronoun is used as usage of pronouns instead of nouns is also language-specific. Though I guess it could be an optional tag.)
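The division of labour described above can be sketched in a few lines. The representation and function names are invented for illustration: the abstract "sentence" stores only the concept of a subject and its social gender, and choosing "he"/"she"/"they" is left entirely to the English "renderer":

```python
# Toy English pronoun "renderer": the abstract representation carries only
# the concept; the language-specific rule picks the surface form.

def render_pronoun_en(subject):
    # English encodes social gender in 3rd person pronouns; a renderer for
    # a language without gendered pronouns would simply ignore this field.
    return {"male": "he", "female": "she"}.get(subject.get("gender"), "they")

subject = {"concept": "person", "gender": "male"}
print(render_pronoun_en(subject))  # -> he
```

A renderer for, say, Finnish (which has no gendered 3rd person pronoun) would read the same abstract subject and emit its single pronoun regardless of the gender field.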

    Often there’s no such thing as the “default”. The example with pig/pork is one of those cases - if whoever is writing the article doesn’t account for the fact that English uses two concepts (pig vs. pork) for what Spanish uses one (cerdo = puerco etc.), and assumes the default (“pig”), you’ll end up with stuff like *“pig consumption has increased” (i.e. “pork consumption has increased”). And the abstraction layer has no way to know if the human is talking about some living animal or its flesh.

    No, that’d simply be a mistake in building the abstract sentence. The concept of a pig was used rather than the concept of edible meat made from pig which would have been the correct subject to use in this sentence.

    Mistakes like this will happen and I’d even consider them likely to happen but the cool thing here is that “pig consumption has increased”, while obviously slightly wrong, would still be quite comprehensible. That’s an insane advantage considering that this would apply to any language for which a generic “renderer” was implemented.


    It ends like that story about a map so large that it represents the terrain accurately being as big as the terrain, thus useless.

    As I said in the top, you’ll end with a “map” that is as large as the “terrain”, thus useless. (Or: spending way more effort explicitly describing all concepts that it’s simply easier to translate it by hand.)

    I don’t see how that would necessarily be the case. Most sentences on Wikipedia are descriptive in nature and follow rather simple structures, complicated further only to aid text flow. Let’s take the first sentence of the Wikipedia article on Lemmy:

    Lemmy is a free and open-source software for running self-hosted social news aggregation and discussion forums.

    This could be represented in a hypothetical abstract sentence like this:

    (explanation
     (proper-noun "lemmy")
     (software-facilitating
      :kind FOSS
      :purpose (purposes
                (apply-property 'self-hosted '(news-aggregation-platform discussion-forum)))))
    

    (IDK why I chose lisp to represent this but it felt surprisingly natural.)

    What this says is that this sentence explains the concept of lemmy by equating it with the concept of a software which facilitates the combination of multiple purposes.

    A language-specific “renderer” such as the English one would then take this abstract representation and turn it into an English sentence:

    The concept of an explanation of a thing would then be turned into an explanation sentence. Explanation sentences depend on what it is that is being explained. In this case, the subject is specifically marked as a proper noun which is usually explained using a structure like “<explained thing> is <explanation>”. (An explanation for a different type of word could use a different structure.) Because it’s a proper noun and at the beginning of a sentence, “Lemmy” would be capitalised.

    Next comes the explanation part, which is declared as the concept of software of the kind FOSS facilitating some purpose. The combined concept of an object and its purpose is represented as “<object> for the purpose of <purpose>” in English. The object is FOSS here and specifically a software facilitating some purpose, so the English “renderer” can expand this into “free and open-source software for the purpose of facilitating <purpose>”.

    The purpose given is the purpose of having multiple purposes and this concept simply combines multiple purposes into one.
    The purposes are two objects to which a property has been applied. In English, the concept of applying a property is represented as “a <property as adjective> <object>”, so in this case “a self-hosted news-aggregation platform” and “a self-hosted online discussion forum”. These purposes are then combined using the standard English method of combining multiple objects which is listing them: “a self-hosted news-aggregation platform and a self-hosted online discussion forum”. Because both purposes have the same adjective applied, the English “renderer” would likely make the stylistic choice of implicitly applying it to both, which is permitted in English: “a self-hosted news-aggregation platform and online discussion forum”.

    It would then be able to piece together this English sentence: “Lemmy is a free and open source software for the purposes of facilitating a self-hosted news-aggregation platform and online discussion forum.”.

    You could be even more specific in the abstract sentence in order to get exactly the original sentence but this is also a perfectly valid sentence for explaining Lemmy in English. All just from declaring concepts in an abstract way and transforming that abstract representation into natural language text using static rules.
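The static rules walked through above can be sketched as a toy "renderer". Everything here is invented for illustration and hard-codes exactly the rules just described; a real system would obviously need a far richer rule set:

```python
# Toy English "renderer" for the abstract sentence sketched above.

def render_purpose_en(purposes, shared_property):
    # Rule: apply a shared property once and list the objects --
    # the stylistic choice described above ("a self-hosted X and Y").
    return f"a {shared_property} " + " and ".join(purposes)

def render_explanation_en(name, kind, shared_property, purposes):
    # Rule: a proper noun is explained via "<explained thing> is <explanation>",
    # and an object with a purpose becomes "<object> for the purposes of <purpose>".
    explanation = (f"a {kind} for the purposes of facilitating "
                   f"{render_purpose_en(purposes, shared_property)}")
    return f"{name.capitalize()} is {explanation}."

sentence = render_explanation_en(
    name="lemmy",
    kind="free and open source software",
    shared_property="self-hosted",
    purposes=["news-aggregation platform", "online discussion forum"],
)
print(sentence)
# -> Lemmy is a free and open source software for the purposes of facilitating
#    a self-hosted news-aggregation platform and online discussion forum.
```

The point is not the trivial string plumbing but that each rule is purely language-specific: a renderer for another language would consume the same abstract arguments and apply its own structural rules instead.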







  • At $200, that’s a great deal.

    It’s IPS, so contrast is quite poor but I’d consider it a great stop-gap until OLEDs are feasible to buy for you.

    Make sure you set the overdrive to “Fast” for the optimal VRR experience.

    I’ll use the rest of my budget to invest in some Ergotron arms

    Note that the “Amazon Basics” branded monitor arm is an Ergotron one but a lot cheaper with no obvious quality deficit. It’s currently holding the monitor I’m typing this on ;)





  • Nvidia has been slowly trying to open up a little over the years: first GBM support in the proprietary driver, then the open OOT module and finally GSP firmware for the kernel, allowing an OSS kernel module to exist.

    The OSS graphics community has obviously shown that it doesn’t want Nvidia’s open module (which is tied to the proprietary driver anyways) and would rather build out its own OSS drivers atop an adapted Nouveau/NOVA. Perhaps Nvidia finally realised this?

    I’m sceptical too but for now this appears to be an actually good move from Nvidia?




  • VA has historically been terrible for high framerate, decent for colour accuracy and great for image quality (contrast is so much better on VAs compared to IPS).

    VA panels with decent rise/fall times and not too much overshoot are few and far between. You really have to do your research and even then it’ll be close to or even slightly over the refresh cycle target. Only Samsung’s more recent panels are actually good for high refresh; incredibly good actually.


  • I’ve heard that some VA panels can get a bit wonky with text.

    I haven’t. Where did you hear this?

    OLEDs tend to have issues because of their subpixel layout and I don’t know the current state of font rendering support on M$'s stupid OS.

    I can’t be giving people jaundice, I mean. The Acer isn’t exactly perfect either, but it’s good enough.

    If your current cheap monitor is good enough, any monitor of this class will be at least as good.

    If you need proper colours though, you should rather invest in a calibration device. Even good monitors should be calibrated for your specific room conditions if colour accuracy is of importance.

    Whatever I choose will be my daily driver for probably 7+ years.

    Then I’d get a newer OLED that isn’t prone to burn-in or wait a few months for said OLEDs to go down in price.

    LCDs are not future proof. The vast majority has no proper support for HDR for starters (HDRn’t).

    I’m concerned that there will always be adjustments and compromises if I go curved.

    Curved isn’t as significant as you imagine it to be. You usually don’t notice it at all.

    Even extreme curvature as is common on Samsung’s newer VA panels is only a little noticeable when you actually sit in front of it, even though it looks like a lot from the outside.

    With VA, you actually want curvature as there is somewhat of a significant gamma shift when looking at a horizontal angle and the curvature helps mitigate this effect.

    I really enjoyed 240hz G-Sync smoothness, but I don’t play serious competitive stuff and I could downgrade to 144hz, as long as the other benefits are worth the trade-off. I also think QHD will hover around 180fps in my current games, and UWQHD around 140 maybe. I’d probably only get the full benefit of 240hz QHD in older games.

    Unless you really love playing e-sports games competitively and that makes up most of your gaming time, 144Hz is good enough. VRR with LFC is a must though; look out for that.

    Do any of you own either of these or similar monitors?

    I’m not too familiar with the current market but what I can tell you is that you must always look up real-world measurements of rise and fall times, overshoot and colour accuracy. Ideally read or watch a review (TFT Central, RTings, HWunboxed/monitors unboxed).

    This is especially important with VA panels as the vast majority use older tech that can have very slow rise and fall times that are often not actually sufficient for high refresh rates and/or bad overshoot. You need to filter these out.

    I would not buy a monitor without having seen real-world measurements of rise and fall times as well as overshoot.

    If you have other recommendations in the $450-600 range

    HW unboxed usually has many “current” monitors for comparison in their charts. I can highly recommend watching a few of their reviews.


    I personally bought a UWQHD Nano IPS panel (LG 34GN850) after attempting to buy a working Samsung G9 VA UUWQHD twice a few years ago (yay Samsung QC…). It’s decent but I wouldn’t buy it again nowadays. I really miss the contrast of the old (S)VA panel I had before. Decent VA panels have a ~3000:1 contrast ratio while rather good IPS panels only have ~1000:1; it’s really that much of a difference.

    I’d only buy IPS if I couldn’t find a VA with fast enough transition times for my specific constraints or desperately needed a colour accurate display.

    These days, I’d buy LG WOLED or Samsung QD-OLED (or wait for them to go down in price).