

  • Debian is community run, which means changes and features get implemented because the community wants them, not because some corporation does. A notable example of the opposite is Snap, which Canonical pushed into Ubuntu.

    Also, I found (minimal install) Debian a bit more minimalist than Ubuntu Server, which is great imo. I just want the bare minimum for my services to work, and pretty much the only things I expect my server to have are SSH and Docker.



  • My recommendation would be to use Logseq.

    It’s similar to Obsidian (“Second Brain”/ PKM), but with the journal function as its backbone.

    It relies heavily on crosslinking, is markdown-based, very efficient and a joy to use once you “get” it, and supports a whole lot of features, including TODOs, plugins, a knowledge network (“graph view”) and much more.

    I use it for everything (external brain) and have pretty much never loved a piece of software this much!
    It sounds like it is THE tool you’re searching for!



  • Because containers (Distrobox, Flatpak, etc.) are bae.
    You can read my post I made a while ago for more information: https://feddit.de/post/8234416

    Once you “get” image-based distros, you probably never want to go back. Traditional distros just feel… off to me now.
    Containerisation is one of Linux’s biggest strengths; we use it all the time on servers, so why not on the desktop?
    Atomic OSs just make more sense to me, not only for security/ bug/ whatever reasons, but also because they feel simpler and are pretty convenient and robust.




  • There’s a big shift happening right now, you’re right about that.
    Traditionally, ARM hasn’t been as capable at heavy, complex workloads as x86, but it’s more efficient.

    That’s why it has always been used in smartphones, for example. There you want long battery life and don’t need to do highly complex stuff, that’s what you have your PC for.

    For years, the big focus was topping the competition in terms of performance. Only now are people beginning to question whether the computing power they already have isn’t enough, and whether they wouldn’t rather have a device that’s more efficient.
    The tradeoff is that you’re tied to that specific architecture. Apple solved this by making a compatibility layer (Rosetta 2) for x86 apps, but that of course comes with a performance hit.

    I’m no expert in that topic tho, so take all I said with a lil grain of salt.

    Right now, I think you’re better off with x86, because your server will definitely run on some sort of Linux, and Linux doesn’t have a comparable compatibility layer yet.


  • Where I live, electricity is also very expensive. I monitor every watt.

    I asked the same question half a year ago; here’s what I’ve learnt: RPis tend to be less reliable and aren’t that energy efficient. They’re great for small appliances, but not so much for servers (e.g. a NAS).

    Get a used thin client/ mini PC. They cost somewhere between 50 and 150 €, and give you a huge performance boost, more ports, an x86 architecture, better repairability (though often still bad) and more.

    Mine uses about 10-15 W in normal use, and rarely 20 W when my cloud is under heavy load.
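
    If you want to turn watts into money, the math is simple. A minimal sketch; the electricity price here is just an assumed example, plug in your local rate:

      # Rough annual running cost of an always-on server.
      avg_power_w = 12       # average draw in watts (from the 10-15 W range above)
      price_per_kwh = 0.35   # EUR per kWh -- assumed example, use your local rate

      kwh_per_year = avg_power_w * 24 * 365 / 1000
      cost_per_year = kwh_per_year * price_per_kwh

      print(f"{kwh_per_year:.0f} kWh/year -> {cost_per_year:.2f} EUR/year")
      # 105 kWh/year -> 36.79 EUR/year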


  • I think, in general, there’s now a bigger gap between amateurs and professionals.

    Amateurs, who just want some easy snapshots or video recordings and who used to buy said camcorders or compact cameras, have practically died out. They now use their newest Pixel or iPhone, which provides good quality for the price, is very simple (algorithms do the dirty work for you + an easily accessible UI) and is with them all the time.
    The quality is decent, and most people don’t even notice the difference to begin with.

    Professional photo- and videographers, on the other hand, spend a lot of money on equipment. They want every tiny bit of quality for their work, and often don’t care whether a camera costs 1500 or 2000 bucks.

    Companies noticed that and now only offer two classes: the “phone with good camera” for casual photography, and “fucking expensive equipment” stuff for pros.

    We both are the rare exception. I also just bought a compact camera recently, because I don’t like photographing with my phone.
    We are a dying breed.


    The real question is: why do you want 4K?
    The sensor/ image quality is way more important.
    4K is just the number of pixels, and it’s useless if the sensor doesn’t get enough light.
    You can shoot in 4K and still get very bad quality. The drawback is a lot of wasted storage.

    The only reasons, imo, to get one are if you shoot for very high-res screens or crop a lot in post-production.
    But 4K AND good hardware is pro territory.
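
    To put the wasted-storage point into numbers, here’s a rough sketch. The bitrates are assumed ballpark values for consumer cameras, not specs of any particular model:

      # Rough storage use: 4K vs. FHD video.
      bitrate_4k_mbps = 100   # assumed ~100 Mbit/s for 4K consumer video
      bitrate_fhd_mbps = 28   # assumed ~28 Mbit/s for FHD

      def gb_per_hour(mbps: float) -> float:
          """Convert a bitrate in Mbit/s into gigabytes per hour of footage."""
          return mbps * 3600 / 8 / 1000

      print(f"4K:  {gb_per_hour(bitrate_4k_mbps):.1f} GB/hour")    # 45.0 GB/hour
      print(f"FHD: {gb_per_hour(bitrate_fhd_mbps):.1f} GB/hour")   # 12.6 GB/hour

    Same hour of footage, roughly 3-4x the storage, and none of it helps if the sensor is starved for light.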


    You have 3 options:

    • Keep your budget below 300 bucks and use a phone, old used camera or cheap device.
    • Adjust your budget to ~400-600 €/$ and get a very solid middle-ground device, like a Sony RX100 III/ IV/ V. I believe the III shoots FHD, and the IV and up can shoot 4K. They are a solid option and yield good results.
    • Or spend a lot of $$$ and get something very high quality, which would be completely out of budget for you.

    If I were you, I would get a used higher-quality camera.
    They still perform great and will hold up for a few years to come, while also being less expensive than new devices.

    My honest advice is: if you are the guy who has to shoot weddings, either don’t do it and leave this special day to the pros, or get good equipment that will last you for the next few years. Getting a camcorder right now sounds like a waste of money in my opinion.

    Edit: I have the RX100 III mentioned above; it doesn’t shoot 4K. If you want, I can send you an example video I’d shoot for you, and then you can decide whether FHD is enough for you.


  • Very lovely, thank you for your awesome guide! I wanna see way more of this kind, very helpful and straight to the point.


    I already developed this exact style by accident and I’m using it most of the time for my pics.
    I have a 1/4 or 1/2 Black Mist filter strapped on and do the following post-processing steps:

    • decrease contrast
    • increase brilliance
    • add some grain
    • a stronger filmic RGB curve, with the blacks lifted and the highlights pushed quite hard
    • and then increase or decrease the strongest color in the color spectrum thingy (I don’t know the english name for it, sorry).

    Here are a few recent examples: DSC05701, DSC06212, DSC05699, DSC05737, DSC06126




  • Dude… It’s the hundredth time you’ve posted this copypasta.
    Image-based OSs aren’t locked down and also don’t depend on proprietary services.

    You can read the post I made about immutable systems; maybe we can discuss it there.

    But I wouldn’t choose an image-based OS for servers right now either. At least not yet.
    I’m just worried about compatibility, because many installers and services may still rely on access to the root file system. Right now, Debian is the best choice as a server OS, but that might change in the future.



    Looks great! Lovely composition and framing!

    If I’m allowed to criticize: I just think the facial expression could be a tiny bit better, because it looks a bit out of place. Kind of like you weren’t allowed to scratch your ear while shooting 🌝

    Also, how did you create that bloom effect and grain? Fits the whole look very well!



  • I don’t know what your intention is.
    I’m no expert or highly qualified in any way, so please correct me, but I’m not sure this is the right way to go.

    LLMs usually need lots of computing power, optimally in the form of a GPU.
    I use GPT4All, and when I send a prompt, I notice the temps/ fan speed and usage of my GPU instantly shooting up to almost 100%. If it’s a longer one, my PC sounds like a helicopter 😁
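
    For reference, this is roughly all it takes to run a local model with the gpt4all Python bindings. A minimal sketch; the model file name is just an example, any model from its catalogue works:

      from gpt4all import GPT4All

      # Example model file; downloaded automatically on first use.
      model = GPT4All("orca-mini-3b-gguf2-q4_0.gguf")

      # Every generate() call hammers the CPU/GPU -- this is the moment
      # the fans spin up.
      with model.chat_session():
          print(model.generate("Why is the sky blue?", max_tokens=200))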

    In terms of hosting a server, you want something barely good enough for your services, e.g. running your cloud. That results in way less power draw, which is what you want, since it runs 24/7. Something powerful enough to run LLMs comfortably would likely draw lots of power, even an Apple Silicon machine.

    I think you’re better off just using GPT4All on your gaming PC when you need it.

    I hope I’m wrong and M1s barely draw any power, especially at idle.
    And even if I am, they can (almost) only run macOS, which wouldn’t be a good server OS.



  • Small critique: I think you overdid it a bit with the contrast. Don’t get me wrong, on a B&W image that’s good; it helps the shapes of the wonderful cloud formations pop. But maybe decrease the intensity of the darks, that would make it a bit less apocalyptic 😁
    Also, maybe crop out or retouch the bottom few millimeters. The trees/ lights look a bit out of place and steal attention from the main subject.

    Other than that, I love it! Keep going! 🙌