  • 0 Posts
  • 41 Comments
  Joined 1 year ago
  Cake day: July 18th, 2023


  • vyatta and vyatta-based (edgerouter, etc) I would say are good enough for the average consumer.

    WTF? What galaxy are you from? Literally zero average consumers use that. They use whatever router their ISP provides, whatever is currently advertised in tech media, or whatever is sold at retailers.

    I’m not talking about budget routers. I’m talking about ALL software running on consumer routers. It’s all dogshit closed-source burn-and-churn that barely receives security updates even while the hardware is still in production.

    Also you don’t need port forwarding and ddns for internal routing. … At home, all traffic is routed locally

    That is literally the recommended config for consumer Tailscale and any mesh VPN. Do you even know how they work? The “external dependency” you’re referring to (their coordination servers) basically operates like DDNS, supplying the DNS/routing between mesh clients. Beyond that, all comms are P2P, including LAN access. (There’s a toy sketch of that lookup flow at the end of this comment.)

    Everything else you mention is moot because Tailscale, Nebula, etc. all have open-source server alternatives that are far more robust and foolproof than rolling your own VPS and WireGuard mesh.

    My argument is that “LAN access”, with all the “smart” devices and IoT surveillance-capitalism spyware on it, is the weakest link, and relying on mesh VPN software to carve that off into its own isolated overlay network is significantly more secure than relying on open LAN access handled by consumer routers.

    Just because you’re commenting on selfhosted, on Lemmy, doesn’t mean you should recommend the most complex and convoluted approach, especially when you don’t even know how the underlying tech actually works.
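
    Here’s roughly what I mean by “operates like DDNS”. This is a toy sketch, not Tailscale’s or Nebula’s actual protocol; all names and structures are made up for illustration:

    ```python
    # Toy coordination server: a dynamic directory mapping node identity
    # to its current public endpoint, much like DDNS for mesh peers.
    directory: dict[str, str] = {}

    def register(node_id: str, endpoint: str) -> None:
        """Each client periodically reports where it can be reached."""
        directory[node_id] = endpoint

    def lookup(node_id: str) -> str | None:
        """Peers ask the server where a node currently is."""
        return directory.get(node_id)

    # After the lookup, traffic flows peer-to-peer; the coordination
    # server never carries the actual (WireGuard-encrypted) packets.
    register("laptop", "203.0.113.7:41641")
    register("homeserver", "198.51.100.2:41641")
    print(lookup("homeserver"))  # -> 198.51.100.2:41641
    ```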


  • What is the issue with the external dependency? I would argue that consumer routers have near-universally shit security, that networking is too complex for the average user, and that there’s greater risk in opening up ports and provisioning your own VPN server (on consumer software/hardware). The port forwarding and DDNS are essentially “external dependencies” of their own.

    Mesh VPN clients are all open source. I believe Tailscale is currently implementing a feature where new devices can’t connect to your mesh without pre-approval from your own already-authorized devices, even if they pass external authentication and 2FA (removing the dependency on Tailscale’s servers for granting authorization, post-authentication). (Toy sketch of that idea below.)
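
    If I understand it right, that feature is conceptually just a vouching signature: a key held only by your existing devices must sign a newcomer before the rest of the mesh accepts it. A minimal sketch of that shape (made-up structure, not Tailscale’s actual implementation):

    ```python
    # Toy "device pre-approval": the coordination server's word alone is
    # not enough; a new node must be vouched for (signed) by a key that
    # only already-authorized devices hold.
    from cryptography.exceptions import InvalidSignature
    from cryptography.hazmat.primitives.asymmetric.ed25519 import (
        Ed25519PrivateKey,
    )

    signing_key = Ed25519PrivateKey.generate()  # lives on YOUR device
    verify_key = signing_key.public_key()       # distributed to all nodes

    def approve(new_node_pubkey: bytes) -> bytes:
        """An already-authorized device signs the newcomer's key."""
        return signing_key.sign(new_node_pubkey)

    def accept(new_node_pubkey: bytes, signature: bytes) -> bool:
        """Every node verifies the vouching signature locally, so a
        compromised coordination server can't sneak nodes into the mesh."""
        try:
            verify_key.verify(signature, new_node_pubkey)
            return True
        except InvalidSignature:
            return False

    newcomer = b"wireguard-pubkey-of-new-device"
    sig = approve(newcomer)
    assert accept(newcomer, sig)              # pre-approved: joins
    assert not accept(b"attacker-key", sig)   # not vouched for: rejected
    ```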



  • I believe this is what some compression algorithms can exploit if you compress the similar photos into a single archive. It sounds like that’s what you want (e.g. archive each day), have immich cache the thumbnails, and only decompress the originals when you view them at full resolution. Maybe test some algorithms like zstd against a group of similar photos vs. compressing each individually? (There’s a rough test script at the end of this comment.)

    FYI, file-system deduplication works on content hashes (of whole files or fixed-size blocks): only exact 1:1 binary duplicates share a hash, so near-identical photos won’t dedupe at all.

    Also, modern image and video codecs are already about as heavily optimized as computer scientists can currently achieve on consumer hardware, which is why re-compressing a JPG or MP4 offers negligible savings, and sometimes even increases the file size.
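
    If you want to measure it, python-zstandard can train a shared dictionary on a batch of files, which approximates the “archive similar photos together” idea. A rough sketch; the path is a placeholder and dictionary training wants a reasonably large sample set:

    ```python
    # Compare per-file zstd compression against compression with a
    # dictionary trained on the whole group of similar photos.
    import glob
    import zstandard as zstd

    samples = [open(p, "rb").read() for p in glob.glob("day-photos/*.jpg")]

    # Per-file compression, no shared state.
    plain = zstd.ZstdCompressor(level=19)
    individual = sum(len(plain.compress(s)) for s in samples)

    # Train a 1 MiB dictionary on the group, then compress with it.
    # (train_dictionary needs a decent number of sample files to work.)
    dictionary = zstd.train_dictionary(1 << 20, samples)
    shared = zstd.ZstdCompressor(level=19, dict_data=dictionary)
    grouped = sum(len(shared.compress(s)) for s in samples)
    grouped += len(dictionary.as_bytes())  # the dictionary is stored too

    print(f"individual: {individual}  with dictionary: {grouped}")
    # Expect little or no win on JPGs: the redundancy between similar
    # shots is mostly invisible after the image codec has done its work.
    ```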


  • WhatAmLemmy@lemmy.world to Asklemmy@lemmy.ml · “Search engines down?” (edited, 4 months ago)

    I was thinking about this and imagined the federated servers handling the index db, search algorithms, and search requests, while leveraging each user’s browser/compute to do the actual web crawling/scraping/indexing; the server simply performs CRUD operations to move the processed data from clients into the index db. This approach would target the core reason search engines fail (the cost of scraping and processing billions of sites), reduce the cost of hosting a search server, and spread the expense across the user base.

    It may also have the added benefit of hindering surveillance capitalism, thanks to the sea of junk queries coming from every client, especially if the crawler requests were made from the same browser (obviously isolated from the user’s own data, extensions, queries, etc). The federated servers would also probably need to operate as lighthouses that orchestrate the domains and IP ranges to crawl, and efficiently distribute the workload to client machines. (Toy sketch of that split below.)
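
    Something like this shape, where the server only orchestrates and CRUDs while clients burn their own compute on the expensive part. Everything here is made up for illustration:

    ```python
    # Toy split: a "lighthouse" server hands out crawl targets and
    # upserts processed records; clients do the crawling and scraping.
    from queue import Queue

    crawl_queue: Queue[str] = Queue()   # lighthouse: targets to crawl
    index: dict[str, dict] = {}         # server-side index db (toy)

    def lighthouse_enqueue(domains: list[str]) -> None:
        """Server distributes crawl targets across connected clients."""
        for domain in domains:
            crawl_queue.put(domain)

    def client_crawl(domain: str) -> dict:
        """Runs on the user's browser/compute: fetch, scrape, and reduce
        a site to a small processed record (stubbed out here)."""
        return {"domain": domain, "terms": ["stub", "terms"]}

    def server_upsert(record: dict) -> None:
        """The server's only heavy job: CRUD on processed records."""
        index[record["domain"]] = record

    lighthouse_enqueue(["example.org", "example.net"])
    while not crawl_queue.empty():
        server_upsert(client_crawl(crawl_queue.get()))
    print(sorted(index))  # -> ['example.net', 'example.org']
    ```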



  • a) why the fuck would they go to that effort for a filthy commoner like yourself, and b) what are the chances that the 0.01% of recoverable data contains anything useful?!

    Nobody is gonna bother doing advanced forensics on 2nd-hand storage, digging into megabytes of reallocated sectors on the off chance they find something financially exploitable. That’s a level of paranoia no data supports.

    My example applies to storage devices which don’t default to encryption (most non-OS external storage). It’s analogous to changing your existing encrypted disk’s password to a random-ass unrecoverable throwaway. (Toy illustration of that below.)
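
    The principle is crypto-erase: once the only copy of the key is gone, every recoverable sector is just noise. A toy illustration (Fernet stands in for your disk encryption; the data and names are made up):

    ```python
    # Crypto-erase in miniature: destroy the key, not the data.
    from cryptography.fernet import Fernet

    key = Fernet.generate_key()  # like a disk-encryption key/password
    ciphertext = Fernet(key).encrypt(b"tax returns, family photos, ...")

    del key  # "change the password to a random unrecoverable throwaway"

    # The bytes are still physically recoverable from the drive...
    print(len(ciphertext), "bytes of perfectly recoverable noise")
    # ...but with the key gone there is nothing that can decrypt them.
    ```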