• 0 Posts
  • 97 Comments
Joined 2 years ago
Cake day: July 7th, 2023


  • I got one of those too. I called customer service to get another route home because of disturbances, and they just have a robot answering. Halfway through the call the robot started reading pure JSON at me, and then said “to get this information as a message press 1” or something. This is what I got:

    Here is your journey from undefined to undefined: BUSS 506 towards Karolinska sjukhuset 09:36 from undefined 10:18 arrived at undefined. Link to your journey.







  • You are not uploading, others are downloading. The direction of traffic is the same, but the incentive is different.

    Before torrents, there were services where you just announced “here are all my files, download what you want”, and some places had quotas on how much you had to make available for download. People were sharing all kinds of random trash and also downloading that trash.

    A joke – was it on bash.org? – went: if you have some file you want to back up, you have to make people want to download it, so you just call it leaked_celebz_nudez.zip or something and it will be downloaded forever, even if it doesn’t work, because so much trash was shared anyway and it could help your quota.

    So you’d have to use the strategy from that joke. Make a torrent of your stuff, just encrypted so it is trash to anyone who downloads it, but make the torrent’s name so incredibly desirable that even when the comments and ratings of the torrent go to shit, people will still be downloading it in the hope that it works. What would be that desirable, though, I do not know.













  • I’ve been thinking of a dreaming-like algorithm for neural networks (NNs) that I have wanted to try.

    When training an NN, you have a large set of inputs and corresponding desired outputs. You take random subsets of this, and for each subset you adjust the NN so its outputs correspond more closely to the desired ones. You do this over and over, and eventually your NN is close to the desired outputs (hopefully). This training takes a long time and is only done this one initial time. (This is a very simplified picture of the training.)
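    That random-subset loop can be sketched with a toy stand-in for the network – here a single linear layer trained with minibatch gradient descent; the dataset, model, and hyperparameters are all made up for illustration:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Toy dataset: inputs X and the desired outputs y (here y = X @ w_true).
    w_true = np.array([2.0, -1.0])
    X = rng.normal(size=(200, 2))
    y = X @ w_true

    # A minimal "network": one linear layer with weights w.
    w = np.zeros(2)
    lr = 0.1

    # Over and over: take a random subset, nudge w so the outputs
    # on that subset correspond more closely to the desired ones.
    for step in range(500):
        idx = rng.choice(len(X), size=16, replace=False)
        pred = X[idx] @ w
        grad = X[idx].T @ (pred - y[idx]) / len(idx)  # mean-squared-error gradient
        w -= lr * grad

    # w should now be close to w_true.
    ```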

    Now for the dreaming. While the NN is “awake” it accumulates new input/output entries. We want to adjust the NN to incorporate these entries too. But if we train only on these, we will lose some of the information the NN learned in the initial training. We might want to train on the original data plus the new data, but that is a lot, so no. Let’s assume we no longer even have the original data. We want to train on what we know and what we have accumulated during the waking time. Here comes the dreaming:

    1. Get an “orthogonal” set of input/outputs representing what the NN already knows (e.g. if the network outputs vectors, take some random input and save the output vector. Use a global optimization algorithm to find the next input whose output vector is orthogonal to the first. Do this until you have a spanning set).
    2. Repeat step 1 until you have maybe one set per newly accumulated input/output entry, or however much appears not to move you too far from the optimization extremum your NN is in – this set should still be much smaller than the original training set.
    3. Fine-tune your NN on the accumulated data plus this generated data. The generated data should act as an anchor, not allowing the NN to deviate too much from the optimization extremum, while the new data also gets incorporated.
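    A minimal sketch of the whole cycle, with a one-weight “network” standing in for the NN; for brevity the orthogonal-spanning search of step 1 is replaced by plain random anchor inputs, and all values are hypothetical:

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    # Stand-in for the already-trained network: a single scalar weight,
    # so the "network" computes y = w * x. w_trained = 2.0 plays the
    # role of the extremum found during the long initial training.
    w_trained = 2.0
    w = w_trained

    # Steps 1-2, simplified: sample a few anchor inputs and record the
    # current network's own outputs for them. These input/output pairs
    # encode what the network already knows.
    x_anchor = rng.normal(size=8)
    y_anchor = w * x_anchor

    # New input/output entries accumulated while "awake", coming from a
    # shifted target (hypothetical new experience with true weight 1.0).
    x_new = rng.normal(size=8)
    y_new = 1.0 * x_new

    # Step 3, the "dream": fine-tune on the new data plus the anchors.
    x_ft = np.concatenate([x_new, x_anchor])
    y_ft = np.concatenate([y_new, y_anchor])
    lr = 0.05
    for _ in range(500):
        grad = x_ft @ (w * x_ft - y_ft) / len(x_ft)
        w -= lr * grad

    # The anchors hold w near 2.0 while the new data pulls it toward
    # 1.0, so the fine-tuned weight lands somewhere in between instead
    # of forgetting the original training entirely.
    ```

    Without the anchor pairs in `x_ft`/`y_ft`, the same loop would drive `w` all the way to the new target – which is exactly the forgetting the generated data is meant to prevent.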

    I see this as a form of dreaming since we have wake and sleep phases. While awake we accumulate new experiences. While asleep we incorporate these experiences into what we already know by “dreaming”, that is, by running small training sessions on our NN.