

It answers the call behind the scenes without you needing to hear anything. It pops up a notification saying that a call is in progress with the screening AI, and you have the option to immediately end the call or pick it up yourself. It also saves a transcript of the call for you to review afterwards. And it doesn’t screen every call: if the number is in your contacts, it rings through to you as normal.
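For anyone curious, the routing logic being described boils down to something like the sketch below. This is purely illustrative, assuming the behavior described above; the names (`decideCallHandling`, `IncomingCall`, etc.) are hypothetical and are not the actual Android or Pixel Call Screen API.

```kotlin
// Hypothetical sketch of the screening decision; not the real Android API.
data class IncomingCall(val number: String)

enum class CallAction { RING_NORMALLY, SCREEN_WITH_AI }

fun decideCallHandling(call: IncomingCall, contacts: Set<String>): CallAction =
    if (call.number in contacts) {
        // Numbers already in your contacts skip screening and ring as usual.
        CallAction.RING_NORMALLY
    } else {
        // Unknown numbers are answered silently by the screener; you get a
        // notification with pick-up / hang-up options and a transcript afterwards.
        CallAction.SCREEN_WITH_AI
    }

fun main() {
    val contacts = setOf("+15551234567")
    println(decideCallHandling(IncomingCall("+15559876543"), contacts)) // SCREEN_WITH_AI
    println(decideCallHandling(IncomingCall("+15551234567"), contacts)) // RING_NORMALLY
}
```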
I think this is quite a bad idea even if we set aside any ethical concerns with AI entirely, solely because it increases the hardware requirements to run a Lemmy instance. A critical goal of federated services should be to lower the barrier to entry for instance ownership as much as possible; the more instances, the better. If there are only two or three big ones, the problems of centralization reappear, albeit diluted, and the whole point of federation is to have many instances. Many instances already survive on donations or outright charity, and AI would increase their costs immensely.
I think it’s fine to add features that require more compute power if they deliver a large improvement to the user experience for the compute required. But AI is one of the most computationally intensive features I can think of, and its ratio of value added to compute required is particularly poor. There’s so little content on Lemmy that you can feasibly view the entire post history of most communities in under a day of browsing, so there’s no real need for improved searchability: it’s just not that big here yet. And even when it does get that big, I think a strong search algorithm would be just about as effective, much more transparent, and, most importantly, would not require instance owners to add GPUs to their servers.