1. 01 Nov, 2016 1 commit
  2. 31 Oct, 2016 2 commits
    • Add progress to creativity-data · a2083b54
      Jason Rhinelander authored
      The progress is displayed both in the error output and in the process
      name shown in top (via prctl).
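      A minimal sketch of how progress can be shown both on stderr and in the
      process name via prctl(PR_SET_NAME); the function and name format here
      are illustrative, not the commit's actual code.

      ```cpp
      #include <sys/prctl.h>
      #include <cstddef>
      #include <cstdio>
      #include <string>

      // Illustrative only: report progress on stderr and in the process name
      // shown by `top`.  PR_SET_NAME truncates the name to 15 characters.
      void report_progress(std::size_t done, std::size_t total) {
          std::fprintf(stderr, "\rprocessed %zu/%zu files", done, total);
          std::string name = "crdata " + std::to_string(done) + "/" + std::to_string(total);
          prctl(PR_SET_NAME, name.c_str());
      }
      ```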
    • Simulation data: make book numbers per capita · 3b90b9cb
      Jason Rhinelander authored
      books_written, books_bought, books_pirated, and books_public are not
      particularly useful as totals because, ceteris paribus, a larger
      simulation will have larger values of all of them.
      
      This changes them all to per-capita values (the first changes to a
      per-100-readers value, the rest to per-reader averages).
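      A sketch of the conversion described above; the field and parameter names
      are hypothetical, and the real values come from the simulation data.

      ```cpp
      #include <cstddef>

      // Hypothetical helper converting raw totals into the per-capita measures.
      struct PerCapita { double written_per_100, bought, pirated, pub; };

      PerCapita to_per_capita(double written, double bought, double pirated,
                              double pub, std::size_t readers) {
          double r = static_cast<double>(readers);
          return {100.0 * written / r, // books_written: per 100 readers
                  bought / r,          // the rest: per-reader averages
                  pirated / r,
                  pub / r};
      }
      ```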
  3. 30 Oct, 2016 2 commits
  4. 29 Oct, 2016 4 commits
  5. 27 Oct, 2016 3 commits
    • Removed debugging · fc9ea94f
      Jason Rhinelander authored
    • gcc-4.9 fix · 68e29e88
      Jason Rhinelander authored
      Sadly, libstdc++ doesn't support moving a stringstream before v5, so
      work around it by accepting a unique_ptr instead.
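      A sketch of the workaround: libstdc++ before version 5 doesn't implement
      the stringstream move constructor, so ownership is passed via a
      unique_ptr instead of an rvalue stringstream (names illustrative).

      ```cpp
      #include <memory>
      #include <sstream>
      #include <utility>

      // Instead of `void consume(std::stringstream &&ss)`, which needs a movable
      // stringstream (libstdc++ >= 5), take ownership through a unique_ptr:
      void consume(std::unique_ptr<std::stringstream> ss) {
          // ... read from *ss ...
      }

      void example() {
          std::unique_ptr<std::stringstream> ss(new std::stringstream);
          *ss << "some data";
          consume(std::move(ss));
      }
      ```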
    • creativity-data: added file preloading · bc62d7ce
      Jason Rhinelander authored
      Right now processing is limited by individual threads each trying to
      read and decompress different files at once, with none of them reading
      from disk until it starts working.  Using a single thread to load the
      files from disk into memory, with the worker threads then reading from
      memory, should reduce the disk contention: one thread reads files from
      disk as fast as possible, but sequentially, so while the disk may still
      bottleneck the process (particularly if the disk is slow or the CPUs
      are numerous and very fast), that bottleneck is reduced by doing all
      the reading sequentially in one thread.
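      A condensed sketch of that pattern: one preloader thread reads files
      sequentially into an in-memory queue and worker threads consume from it.
      A real implementation would also bound the queue so the preloader cannot
      exhaust memory; all names here are illustrative.

      ```cpp
      #include <condition_variable>
      #include <fstream>
      #include <mutex>
      #include <queue>
      #include <sstream>
      #include <string>
      #include <vector>

      struct Loaded { std::string name, contents; };

      std::queue<Loaded> queue_;
      std::mutex mtx_;
      std::condition_variable cv_;
      bool done_ = false;

      // One thread reads every file sequentially into memory:
      void preload(const std::vector<std::string> &files) {
          for (const auto &f : files) {
              std::ifstream in(f, std::ios::binary);
              std::ostringstream buf;
              buf << in.rdbuf();
              std::lock_guard<std::mutex> lock(mtx_);
              queue_.push({f, buf.str()});
              cv_.notify_one();
          }
          std::lock_guard<std::mutex> lock(mtx_);
          done_ = true;
          cv_.notify_all();
      }

      // Workers decompress/parse from memory instead of contending for the disk:
      void worker() {
          for (;;) {
              std::unique_lock<std::mutex> lock(mtx_);
              cv_.wait(lock, [] { return !queue_.empty() || done_; });
              if (queue_.empty()) return;
              Loaded item = std::move(queue_.front());
              queue_.pop();
              lock.unlock();
              // ... decompress and process item.contents ...
          }
      }
      ```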
  6. 25 Oct, 2016 2 commits
  7. 16 Oct, 2016 4 commits
  8. 25 Sep, 2016 12 commits
    • Tweak the default catch parameters · 19ebbfbe
      Jason Rhinelander authored
      The defaults were far, far too small for a catch tax value from 1-100
      (at 100, mu would be -35, with a standard deviation of 1, i.e. almost
      everyone gets caught all the time).

      Changed it to mu = 10 - 0.08*spending (so that at a spending of 100 the
      probability of getting caught, even if innocent, is around 5%, and is
      negligible at a spending of 1).  Sigma now also decreases in spending
      (i.e. detection gets more accurate, particularly at the high end): from
      3 at spending ~ 0 down to 1 at spending of 100.
      
      Increased penalties to an escalating fine of 50, 100, 175, 275, ...
      instead of a fixed cost (at a fixed cost only the increase in
      probability matters, but once you're almost certain to be caught, you
      might as well pirate away).
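      The listed fines (50, 100, 175, 275, ...) grow by increments of 50, 75,
      100, ...; assuming that pattern continues, the schedule could be
      computed as in this sketch (not the commit's actual code).

      ```cpp
      // Escalating fine: 50, 100, 175, 275, 400, ... for the 1st, 2nd, 3rd, ...
      // offence, assuming each increment keeps growing by 25.
      double fine_for_offence(unsigned n) {
          double f = 0;
          for (unsigned i = 1; i <= n; ++i)
              f += (i == 1 ? 50.0 : 25.0 * i);  // increments: 50, 50, 75, 100, ...
          return f;
      }
      ```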
    • Revert previous commit and use another fix · 9ab6363b
      Jason Rhinelander authored
      The previous commit wasn't sufficient: the reader could also have
      chosen to keep other books on the market, and *those* costs could cause
      income overexpenditure.

      This commit fixes the issue by simply making sure the reader always
      takes the market cost into account when creating whenever creation
      could be instantaneous (i.e. if creation_time is 0 *or 1*).
    • Avoid possible insufficient assets for tax · a5084f38
      Jason Rhinelander authored
      Authors don't take the market cost into account when considering the effort limits if
      creation_time is > 0, but if creation_time == 1, it's possible that
      release happens immediately; in that case, the (unexpected) market cost
      being removed from assets could result in assets not being sufficient to
      cover due taxes, which would then hit a negative bundle error.
      
      This fixes it by delaying the release for a period if current assets are
      not sufficient to both cover the market cost and pay taxes (since the
      author won't be creating next period, they should have funds available
      at that point to cover both).
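      A sketch of that check, with hypothetical names: release is postponed
      for a period unless assets cover both the market cost and the taxes due.

      ```cpp
      // Hypothetical helper: only release this period if assets can cover both
      // the market cost and the taxes coming due; otherwise wait a period (the
      // author isn't creating next period, so the funds will be available then).
      bool can_release_now(double assets, double market_cost, double tax_due) {
          return assets >= market_cost + tax_due;
      }
      ```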
    • Finish removing debugging · a446df90
      Jason Rhinelander authored
      An earlier commit removed the lower half of this debugging; remove the
      upper half, too.
    • random: flush output · 09a12244
      Jason Rhinelander authored
      The "Executing ..." line is being lost from ssh connections, I suspect
      because of the lack of flushing.
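      The fix amounts to flushing after writing the status line, e.g.
      (illustrative):

      ```cpp
      #include <iostream>
      #include <string>

      void announce(const std::string &command) {
          // std::flush (or std::endl) ensures the line reaches the terminal even
          // if the connection drops before the buffer would otherwise be flushed.
          std::cout << "Executing " << command << " ..." << std::flush;
      }
      ```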
    • Enhanced intra-optimize comments · b3108e15
      Jason Rhinelander authored
    • Fix reader misoptimization · 72f25386
      Jason Rhinelander authored
      Readers were buying far too often, and thus writers were responding by
      writing far too often, resulting in most simulations having negative
      utility all the time.  The issue was introduced in the intra-optimize
      rewrite in e23ef96e.
    • Add some randomness to creation_time · 13adef79
      Jason Rhinelander authored
      Attempt to address cyclical behaviour, part 2.
      
      One reason for cyclical behaviour is that the top creators all want to
      create as much as possible, but when creation_time > 0 this means they
      are cycling and all end up creating at the same time (this is amplified
      by the fact that they start using beliefs at the same time).
      
      This adds some randomness to the creation time, so that the actual
      creation time is creation_time + {-1, 0, +1} with equal probability.
      (This only has an effect when creation_time >= 1, of course: at a
      creation_time of 0 we can't apply a mean-preserving spread without
      allowing negative creation times, and the cyclical behaviour won't
      happen anyway when people can create every single period.)
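      A sketch of the perturbation using the standard <random> facilities (the
      function name is illustrative):

      ```cpp
      #include <random>

      // Actual creation time is creation_time + {-1, 0, +1} with equal
      // probability; only applied when creation_time >= 1 so the result
      // can't go negative.
      int perturbed_creation_time(int creation_time, std::mt19937 &rng) {
          if (creation_time < 1) return creation_time;
          std::uniform_int_distribution<int> shift(-1, 1);
          return creation_time + shift(rng);
      }
      ```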
    • Remove "." from filename output · 04932864
      Jason Rhinelander authored
      Having the . is annoying when trying to copy and paste the filename
      (double-clicking the filename to select it also selects the period).
    • Use mkt books over creation cycle, not previous period · 46ce587f
      Jason Rhinelander authored
      Many simulations have very strong creation cycles, especially in
      pre-piracy periods, due to the use of a single lag: if creation_time is
      1, for instance, there could exist a (high,low,high,low,...) equilibrium
      because the high number of books on the market in the previous period
      induces low creation this period, and the resulting low number of market
      books in the next period then induces high output in the following one.
      
      This fix attempts to address the cyclical behaviour by smoothing the
      belief input: it now uses the average over the last `creation_time+1`
      periods.

      This is unlikely to eliminate all cycles, however: some cycles happen
      because talented creators create as often as possible, and since their
      beliefs begin at the same time, they all end up following the same
      action.
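      A sketch of the smoothing: the belief input becomes the mean over the
      last creation_time + 1 periods instead of just the previous period
      (names illustrative).

      ```cpp
      #include <algorithm>
      #include <cstddef>
      #include <deque>
      #include <numeric>

      // Hypothetical helper: average market-book counts over the last
      // `creation_time + 1` periods of history (most recent values at the back).
      double smoothed_market_books(const std::deque<double> &history, unsigned creation_time) {
          std::size_t n = std::min<std::size_t>(history.size(), creation_time + 1);
          if (n == 0) return 0.0;
          return std::accumulate(history.end() - n, history.end(), 0.0) / n;
      }
      ```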
    • info: fix --policy argument output in -c mode · 211602b1
      Jason Rhinelander authored
      It was misnamed --policies, and had no space before the initial dash.
    • bb069227
  9. 21 Sep, 2016 4 commits
  10. 20 Sep, 2016 2 commits
  11. 19 Sep, 2016 1 commit
    • catch-and-fine work in progress · fb3b948a
      Jason Rhinelander authored
      Added CopyrightPolice agent to handle catch-and-fine policy.
      
      Various smaller changes and additions to make the above work.
      
      Moved the first field of `policy_catch_fine` into its own field,
      `policy_catch_cost`.  It also gets a slightly different interpretation
      now: it is the cost incurred (but *not* the fine amount) by being
      accused; this amount is simply lost forever, while the fine gets
      redistributed.
      
      Changed the redistribution mechanism: redistribution now goes to
      authors whose books were infringed, not to the population at large.
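      A sketch of the changed redistribution: collected fines go to the
      authors whose books were infringed rather than to the population at
      large.  The commit doesn't say how the amount is split, so this sketch
      simply assumes an equal share per infringed author.

      ```cpp
      #include <cstdint>
      #include <map>
      #include <set>

      // Hypothetical: split the collected fines equally among infringed authors.
      void redistribute_fines(double collected,
                              const std::set<std::uint64_t> &infringed_authors,
                              std::map<std::uint64_t, double> &assets) {
          if (infringed_authors.empty()) return;
          double share = collected / infringed_authors.size();
          for (auto id : infringed_authors) assets[id] += share;
      }
      ```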
  12. 18 Sep, 2016 2 commits
  13. 02 Aug, 2016 1 commit