• 0 Posts
  • 60 Comments
Joined 6 months ago
Cake day: January 13th, 2025

  • Really, the first issue is your IP address. How does your ISP hand out addresses: IPv4, IPv6, or both?

    If your ISP gives you a static block of IPv6 addresses, that simplifies things immensely. Consider, though, that many legacy, monopoly ISPs still haven’t implemented IPv6 for their customers, especially in the US, so a domain without an IPv4 address isn’t reachable from homes served by those ISPs. But a static block means you could assign a static IPv6 address to each service if you wanted to and add a subdomain for each. Then you just need to deal with security on that system.

    Otherwise you’ll likely need to deal with dynamic DNS. If your router and your domain registrar’s DNS can work together for DDNS, that’s ideal. For example, my OPNsense router updates my Cloudflare-registered domain directly when my ISP changes my IPv4 address (I have one of those ISPs that still doesn’t assign IPv6, but I don’t have any choice if I want more than 5–10 Mbps upload speeds).
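
    As an illustration of what such a DDNS client does under the hood (a sketch only, not the OPNsense plugin; the zone ID, record ID, token, and hostname are placeholders), here’s the idea in Java against Cloudflare’s DNS API:

    ```java
    import java.net.URI;
    import java.net.http.HttpClient;
    import java.net.http.HttpRequest;
    import java.net.http.HttpResponse;

    public class DdnsUpdate {
        // Placeholders: find your zone/record IDs and API token in the
        // Cloudflare dashboard.
        static final String ZONE = "your-zone-id";
        static final String RECORD = "your-record-id";
        static final String TOKEN = "your-api-token";

        public static void main(String[] args) throws Exception {
            HttpClient http = HttpClient.newHttpClient();

            // 1. Discover the current public IPv4 address.
            String ip = http.send(
                    HttpRequest.newBuilder(URI.create("https://api.ipify.org")).build(),
                    HttpResponse.BodyHandlers.ofString()).body().trim();

            // 2. Overwrite the A record so the domain follows the new address.
            String json = "{\"type\":\"A\",\"name\":\"home.example.com\","
                    + "\"content\":\"" + ip + "\",\"ttl\":300,\"proxied\":false}";
            HttpRequest update = HttpRequest.newBuilder(URI.create(
                    "https://api.cloudflare.com/client/v4/zones/" + ZONE
                            + "/dns_records/" + RECORD))
                    .header("Authorization", "Bearer " + TOKEN)
                    .header("Content-Type", "application/json")
                    .PUT(HttpRequest.BodyPublishers.ofString(json))
                    .build();
            System.out.println(http.send(update,
                    HttpResponse.BodyHandlers.ofString()).body());
        }
    }
    ```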

    Then you need to deal with routing. The best way is with a reverse proxy like Caddy; I actually like Traefik a lot because it works well with my complex setup involving Docker and Kubernetes, among other things. Basically, your router forwards all inbound traffic on the appropriate ports to the reverse proxy, which then routes each request to the appropriate service based on its subdomain and/or port.
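
    As a concrete example, a minimal Caddyfile entry (hostname and backend address are placeholders) that sends one subdomain to one internal service; Caddy also obtains and renews the TLS certificate for the name automatically:

    ```
    app.example.com {
        # Forward matching requests to the service on the LAN
        reverse_proxy 192.168.1.50:8080
    }
    ```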

    Once you route the subdomain to the appropriate service, you need to deal with security. Once a service is exposed, it will eventually start getting hit by bots trying to access it. It’s best to implement something like fail2ban to stop them from wasting your processing power on failed logins, 404 errors, and the like.
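
    A minimal sketch of a fail2ban jail (assuming the stock sshd filter; the thresholds are arbitrary examples):

    ```ini
    # /etc/fail2ban/jail.local — ban IPs that repeatedly fail SSH logins:
    # 5 failed attempts within 600 s earns a 3600 s ban.
    [sshd]
    enabled  = true
    maxretry = 5
    findtime = 600
    bantime  = 3600
    ```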


  • I set up separate VLANs for devices that do or don’t get filtering, with different DNS servers assigned to each. I have two different wifi SSIDs on my access point for the two VLANs, and ports on my primary switch are assigned to one VLAN or the other. I did end up with one other switch in a different area that has devices from both VLANs, so I set up its port on the primary switch with a couple of MAC-based filters that assign the VLAN for the devices on that remote switch; those are static devices, so that wasn’t an issue. I don’t attach any other devices to that switch.
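
    Not my actual router config, but as a generic illustration of the same idea (interface name, VLAN IDs, and subnets are made up), tagged VLAN subinterfaces on a Linux box look like this:

    ```sh
    # Two tagged subinterfaces on one physical NIC, one per VLAN
    ip link add link eth0 name eth0.10 type vlan id 10   # filtered VLAN
    ip link add link eth0 name eth0.20 type vlan id 20   # unfiltered VLAN
    ip addr add 192.168.10.1/24 dev eth0.10
    ip addr add 192.168.20.1/24 dev eth0.20
    ip link set eth0.10 up
    ip link set eth0.20 up
    ```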


  • The Pixel 9a has some performance issues currently. Google used older storage tech in the 9 and 9a than in other devices, and not enough memory for all the “AI” features that track everything you do to make things more convenient. There are a few articles out there on ways to improve performance a bit by disabling background apps you may not be using. It’s also possible that future updates from Google will fix some of whatever is causing the issues for many users.

    But it’s not an endemic Android issue, at least not on modern versions in my experience. I currently use GrapheneOS on a Pixel 7 Pro; I just grabbed a couple of 10+ GB zip files I had on an old Dropbox account and unzipped them with the Fossify file manager. It was basically instant; downloading them took longer than unzipping.

    As I mentioned, your best bet is to use ADB or similar to monitor which applications are eating up resources and try to free some up, especially any apps thrashing the storage or filling memory. That assumes you’ve already uninstalled any bloatware and rebooted recently to make sure no misbehaving apps are stuck.
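
    A few stock adb commands for that kind of triage (the package name in the last one is just a placeholder):

    ```sh
    adb shell top                  # live per-process CPU/memory view
    adb shell dumpsys meminfo      # per-app memory breakdown
    # Remove a preinstalled app for the current user (works without root)
    adb shell pm uninstall --user 0 com.example.bloatware
    ```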


  • The bottleneck is usually storage speed rather than processing power. If your device supports high-speed SD cards, an external card might help, though if the SD card controller is slow, it might just become a worse bottleneck. But that’s just a guess; it could also be that your memory is insufficient or that background apps are eating up processing power (crypto-mining malware, just as an example). You can check resource usage over adb while unzipping, or try some benchmarks, to pin down your issue.

    Anecdotally, I have no issues on my Pixel 7 Pro and never had issues on past Pixel or Nexus phones I’ve owned (generally higher-end models with plenty of memory and storage space). Pixel devices don’t include SD card slots, so this was all on internal storage in those cases.

    Sure, anything is likely to take longer on a phone than on a laptop or desktop, but it shouldn’t be that significant a difference unless there’s a hardware bottleneck or other apps are using all the resources.


  • I’ve used Java Scanner objects to do this extremely efficiently, with minimal memory required, even with multiple parallel searches. Indexing is only worthwhile if you want to search the data many times and don’t know in advance exactly what the search will be. For one-time searches it’s not going to be useful; grep is honestly going to be faster and more efficient for most one-time searches.

    The initial indexing or searching of the files will be bottlenecked by the speed of the disk they’re on, no matter what you do. Indexing only helps because it moves future searches into faster memory.

    So it depends greatly on what you need to search and how often; the tradeoff is memory usage, and it only pays off for repeated searches over data you chose to index on the first pass.
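
    A minimal sketch of the streaming approach (file path and search term passed as arguments; no indexing, and memory stays flat regardless of file size):

    ```java
    import java.io.File;
    import java.io.FileNotFoundException;
    import java.util.Scanner;

    public class StreamSearch {
        public static void main(String[] args) throws FileNotFoundException {
            // Scanner reads through an internal buffer, so only the
            // current line is ever held in memory.
            try (Scanner in = new Scanner(new File(args[0]))) {
                long lineNo = 0;
                while (in.hasNextLine()) {
                    lineNo++;
                    String line = in.nextLine();
                    if (line.contains(args[1])) {
                        System.out.println(lineNo + ": " + line);
                    }
                }
            }
        }
    }
    ```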


  • The problem with Manjaro is that they run their own opinionated repository that is not always in sync with Arch, because they try to introduce more “stability”. I found this actually caused the opposite in most cases: a lot of dependencies end up lagging behind, so you can’t install more stable versions of a lot of software. With the complexity of modern software dependencies, that has become a big problem. They have also caused lots of problems with the AUR in the past and have let their SSL certs expire multiple times. Overall, they just haven’t been reliable IMHO, so I moved to Fedora a while back.


  • LLMs are perfectly fine, and cool tech. The problem is that they’re billed as actual intelligence, or as things that can replace humans. Sure, they mimic humans well enough, but it would take a lot more than just absorbing content to be good enough to replace a human rather than just aid one. Either the content needs to be manually processed to add social context, or new tech needs to be built that includes models for how to interpret content in every culture represented by every piece of content, including dead cultures whose work is available to the model. Otherwise, “hallucinations” (i.e. misinterpretation and thus miscategorization of data) will make them totally unreliable without human filtering.

    That being said, there are many more targeted uses of the tech that are quite good, but always with the need for a human to verify the output.