Archive for the ‘tmsr’ Category

Final Selected Parts For My First Computer

Friday, November 29th, 2019

After previously picking parts for my computer I discovered bestcomputersa's list of items on their website was completely inconsistent with their actual stock. diana_coman decided the best option for me was to follow my original plan of ordering parts from the states. But the day before I gave up sourcing from Costa Rican stores, the rep from pcgamingcr responded to messages I had sent him a few days prior. After that initial delayed reply, he was consistently responsive through WhatsApp. Pcgamingcr had the coveted AMD FX-8350 with compatible motherboards and video cards. I managed to order everything I needed1 from them and cococo. The guts of the computer cost $1,123. The I/O devices and accessories totaled $1,259, bringing the final cost to $2,382. The items I bought are listed below.2

I. Guts

CPU AMD FX-8350 (CPU/MB/VC combo = 273mil, $486)
Motherboard GA-970A-UD3P (rev 2) (see CPU)
Graphics Card Radeon RX-550 Sapphire (see CPU)
RAM 2x Corsair Vengeance 8GB DDR3 memory 1600 MHz (89mil, $158)
PSU Seasonic Focus Plus Gold 850 (95mil, $169)
Primary SSD 1TB Samsung 860 Evo SSD ($140)
Backup Mechanical Drive 1TB Seagate Barracuda, 64MB cache (30mil, $53)
Case Corsair Carbide Spec 06 (58mil, $103)
SD Card Reader Lector de Memoria Interno Xtech3 (8mil, $14)

II. I/O + Accessories

Monitor Dell 24 Monitor: P2419H (166.5mil, $292)
Keyboard Ergodox. ($325)4
Mouse Marvo Scorpion5 (24mil, $43)
UPS UPS APC SMT15006 (319.5 mil, $569)
Thermal Paste 4x 2g Arctic MX-4 Thermal Paste ($30)

  1. Save the Samsung SSD and thermal paste that thimbronion graciously offered to bring from the states + the ergodox keyboard. []
  2. Prices are listed as (colones, usd) with an exchange rate of 562 colones to the usd. If I bought the item with usd directly then I list only the usd price. []
  3. I could not find a link to the spec details on xtech's website. It can read USB, SD, Micro SD, XD, MMC. I am not quite sure what an XD or MMC is. []
  4. In addition I expect to pay a yet unknown import tax. []
  5. Pcgamingcr did not tell me the exact model of mouse; I was looking only for a cheap option. The mouse came with a keyboard I can use while I wait for my fancy Ergodox to get here from Taiwan. []
  6. Recommended Replacement Batteries - Optima Batteries 8052-161 D31M BlueTop Starting and Deep Cycle Battery []
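As a sanity check on footnote 2's pricing convention, a few of the listed conversions can be redone mechanically (a quick sketch, assuming simple rounding at the quoted 562 rate):

```python
def colones_to_usd(colones, rate=562):
    """Convert a price in colones to whole US dollars at the
    exchange rate quoted in footnote 2."""
    return round(colones / rate)
```

For instance the 273mil CPU/MB/VC combo comes out to $486, matching the list above.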

Candidate components for whaack's first build

Monday, November 18th, 2019

Below is the prospective parts list for the machine I intend to use as my workstation. To produce this list I first read through kitchentablecomputer's "Computer Parts" section. After reading the various buying guides, I started picking components based around the processor from the machine on which diana_coman installed cuntoo. The parts also had to be available on bestcomputersa. Once I made my list I checked the other store recommended by hanbot, cococo.co.cr, for better options for the various parts. I only made one change: I replaced my ₡147,500.00 480GB Kingston SSD with a $140 1TB Samsung 860 Evo SSD + a ₡36,500.00 1TB Seagate Barracuda mechanical backup hard drive.1 The price for all the guts (not including S+H, taxes, etc.) comes out to ₡623,500.00 + $140, ~ $1215.

Further work includes making a buy list for I/O devices,2 miscellaneous parts,3 and building tools.4


CPU

₡ 142,500.00

Amd Am3 Fx8350, spec

Motherboard5

₡ 64,500.00

Gigabyte Ga-970A-Ds3P, spec

RAM

₡ 44,000.00 x2

Corsair Vengeance 8GB DDR3 memory 1600 MHz x2, spec

Graphics Card

₡ 120,500.00

GTX 750ti ddr5 4GB, spec

Hard Drive

$ 139.99

1TB Samsung 860 Evo SSD

Backup Internal Mechanical Hard Drive

₡ 36,500.00

DD 1TB Seagate Barracuda SATA 64MB 3.5 7200RPM

PSU 6

₡ 115,000.00

Corsair RM850x plus gold, spec

Cooling 7

₡ 12,500.00

Corsair Air Series AF120, spec

Case 8

₡ 44,000.00

Corsair RED LED Mid Tower Gaming Case, spec

  1. It was a coincidence that the best SSD option I found for myself is the same SSD diana_coman used in her machine. The Seagate Barracuda is a mechanical drive, which should have been obvious given the specs I listed. []
  2. Most notably the monitor []
  3. Such as an ethernet cable []
  4. Such as a screwdriver set and an anti-static device. []
  5. ATX Form Factor; 30.5cm x 21.5cm.

    Note: I have to confirm with bestcomputersa the version of the BIOS is from post 2013, otherwise the motherboard will not support the FX-8350. []

  6. I need to check that this fully modular PSU comes with a sufficient number of the correct cables to connect every power-drawing component I have. []
  7. I still need to find thermal paste. []
  8. 447mm x 200mm x 428mm []

A helpful han(d)bot

Friday, November 15th, 2019

I am working on acquiring a few items in Costa Rica. My goal is to build an ergonomic battle station equipped with a computer I've personally put together. Acquiring the iron necessary for my desired machine is difficult given my location on the beach. So per the advice of diana_coman, I hopped into #trilema-hanbot to speak with a lord keen on becoming accessible to those within tmsr.

hanbot is an expert in Costa Rican matters.1 Upon asking for advice, hanbot immediately shared a list of stores she had handy. For computer parts she recommended best computers sa as well as the more modestly named cococo. For office supplies2 hanbot sent me to muguisa. These stores appear to have what I need.

hanbot also gave some advice regarding couriers I may want to use, should I decide to import goods from the evil empire. She noted that items do arrive reliably, but the companies she has seen are either expensive or require a credit card. hanbot made it clear that she will under no circumstance use the magic plastic. I didn't respond to this point during our conversation,3 but her conviction made me think.4

Getting advice from hanbot saved me hours of searching that may have turned out fruitless. It's becoming obvious I should ignore my initial belief that I shouldn't burden those who know more than me with questions. It is just too expensive to redo work that is avoidable via a short conversation. That said, thank you hanbot!

  1. Whose knowledge goes beyond where to buy office supplies. She also had advice to give regarding purchasing a car. []
  2. I specifically need a proper chair; I hired a local handyman to build a desk tailored to the layout of my apartment. []
  3. How was I to respond? "Ah yes, totally agree with you there hanbot, credit cards are for idiots and it's our moral obligation to separate them from their money. However, they happen to be just too convenient, so I use them anyway." []
  4. The first question I wondered was - how would hanbot's recommended stores charge for items they send out for delivery given they don't require card and are fiat based? I answered this myself by ordering a chair through Muguisa. Upon placing my order, Muguisa gave me a bank account number. They informed me that they will send me the chair when I deposit cash into their account at a local bank. Now I know how ordered goods are paid for in a cash based economy.

    The second question I still have is - would it be wise to cut my credit card in half? On a personal level, the credit card is a spiritually draining item. It whispers in your ear, don't worry about saving for a rainy day my friend, for I am always here to extend you credit should you need it. On a systemic level, credit cards wreak havoc on the economy. They destroy capital allocation by burdening everyone with even more money. I would love to get rid of my card, perhaps in style, but I don't know if that would be prudent. []

Proper HTML Linking, A Battlefield Report

Friday, November 1st, 2019

Having been rightfully flamed for attempting to use a tool I did not understand, I return from the dark hell of the html & php mines with a tiny nugget of information that I hope will aid the republic.

There are two quirks I've noticed with the select displayer.

1. The select displayer may match the values provided in the query parameters b/e to text inside of an html tag. This usually emerges from user error, but often one wants to select text in a link whose opening anchor tag contains the same text. It is currently impossible to select the second "trilema" that follows the "http://trilema.com" in the example:1

<a href="http://trilema.com">trilema</a>

My solution is to find the first match not positioned inside of a tag.2

function first_pos_not_in_tag($hay, $needle, $start) {
  $max_attempts = 2;
  $guess = $start; // Must be > 0 for the while loop condition.
  $length = strlen($hay);
  while ($max_attempts > 0 && $guess && $guess < $length) {
    $guess = strpos($hay, $needle, $guess);
    if ($guess === false) // Needle does not occur at all past this point.
      return false;
    $next_close_pos = strpos($hay, ">", $guess);
    $next_open_pos = strpos($hay, "<", $guess);
    // If the next "<" comes at or before the next ">", the match sits
    // outside any tag and can be returned.
    if ($next_close_pos >= $next_open_pos)
      return $guess;
    // Otherwise the match is inside a tag; resume the search past its ">".
    $guess = $next_close_pos + 1;
    $max_attempts--;
  }
  return false;
}

You must alter your server_side_selection function:

--- $b_pos = strpos($content,$_GET["b"]);
--- $e_pos = strpos($content,$_GET["e"], $b_pos);
+++ $b_pos = first_pos_not_in_tag($content, $_GET["b"], 1);
+++ $e_pos = first_pos_not_in_tag($content, $_GET["e"], $b_pos);
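For anyone who would rather poke at the logic outside of php, here is a rough Python rendering of the same first-match-outside-a-tag search (a sketch for experimentation, not a drop-in for the displayer):

```python
def first_pos_not_in_tag(hay, needle, start=0):
    """Find the first occurrence of needle in hay that does not sit
    inside an HTML tag (between '<' and '>'). Returns -1 when no such
    occurrence is found within the retry budget, mirroring the PHP
    version's false."""
    attempts = 2
    pos = start
    while attempts > 0 and 0 <= pos < len(hay):
        pos = hay.find(needle, pos)
        if pos == -1:
            return -1  # Needle does not occur at all past this point.
        next_close = hay.find(">", pos)
        next_open = hay.find("<", pos)
        # Outside a tag when no '>' follows, or the next '<' comes first.
        if next_close == -1 or (next_open != -1 and next_open < next_close):
            return pos
        pos = next_close + 1  # Inside a tag; resume past its '>'.
        attempts -= 1
    return -1
```

On the anchor example above, it skips the "trilema" inside the href and lands on the link text.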

2. The second quirk is the select displayer often spits out faulty html. For example, the displayer provides no closing </span> if the user leaves the value for e empty.3 This doesn't seem to cause any practical issues; browsers close spans automatically under certain conditions that I have not fully ascertained.

  1. The root of the problem is the displayer does not provide a means to match to the second occurrence of text; there is no way to select only the last duck in duckduckduck. []
  2. This doesn't fix the root problem stated above, but it prevents the select displayer from breaking tags. This is especially useful for stopping other servers' automatically provided b & e values - used to link back to your excerpt when you send them a pingback - from mangling your html tags. []
  3. This may come as a surprise, because some browsers (I've seen chrome) will silently provide a closing </span> where they see fit and will show that inserted </span> in their "view source" tool! []

VPY Annotations

Sunday, October 27th, 2019

Below is asciilifeform's original prototype V with my annotations.

#!/usr/bin/python

##############################################################################
# Quick Intro:
# 1) Create '.wot' in your home directory. Fill it with public keys from 'wot'.
# 2) Create '.seals' in your home directory. Place all signatures there from 'sigs'.
# 3) Create a 'patches' directory somewhere where 'v' can find it. Or use this one.
# 4) ./v.py patches command
# e.g.,
# ./v.py patches w
#                  ^^ displays WoT
#  ./v.py patches p patches/asciilifeform_add_verifyall_option.vpatch asciis_bleedingedge
#                  ^^ this 'presses' (creates the actual tree)
#                  ^^ approximately like a 'checkout' in your vanilla flavoured shithub.

##############################################################################

import os, sys, shutil, argparse, re, tempfile, gnupg

##############################################################################
vver = 1001 # This program's Kelvin version.

## HOW YOU CAN HELP: ##

# * TESTS plox, ty!
#
# Report findings in #bitcoin-assets on Freenode.

##############################################################################

prolog =  '''\
(C) 2015 NoSuchlAbs.
You do not have, nor can you ever acquire the right to use, copy or distribute
this software ; Should you use this software for any purpose, or copy and
distribute it to anyone or in any manner, you are breaking the laws of whatever
soi-disant jurisdiction, and you promise to continue doing so for the indefinite
future. In any case, please always : read and understand any software ;
verify any PGP signatures that you use - for any purpose.2
'''

intro = "V (ver. {0}K)\n".format(vver)

##############################################################################
def toposort3(unsorted):
    sorted = []
    unsorted = dict(unsorted)
    while unsorted:
        acyclic = False
        for node, edges in unsorted.items():
            for edge in edges:
                if edge in unsorted:
                    break
            else:
                acyclic = True
                del unsorted[node]
                sorted.append((node, edges))
        if not acyclic:
            fatal("Cyclic graph!")
    return sorted
##############################################################################

verbose = False

def fatal4(msg):
    sys.stderr.write(msg + "\n")
    exit(1)

def spew5(msg):
    if verbose:
        print msg

# List of files in a directory, in lexical order.
def dir_files6(dir):
    return sorted([os.path.join(dir, fn) for fn in next(os.walk(dir))[2]])

# GPG is retarded and insists on 'keychain.'
# This will be a temp dir, because we don't do any crypto.
gpgtmp = tempfile.mkdtemp()
gpg = gnupg.GPG(gnupghome=gpgtmp)
gpg.encoding = 'utf-8'7

# Known WoT public keys.
pubkeys = {}

# The subset of vpatches that are considered valid.
patches = []

# Banners (i.e. vpatches mapped to their guarantors)
banners = {}

# Roots (i.e. vpatches parented by thin air)
roots = []

# Table mapping file hash to originating vpatch
desc = {}
desc['false' ] = 'false'

# Grep for diff magics, and memoize
def vpdata8(path, exp, cache):
    l = cache.get(path)
    if not l:
        l = []
        patch = open(path, 'r').read()
        for m in re.findall(exp, patch, re.MULTILINE):
            l += [{'p':m[0], 'h':m[1]}]
        cache[path] = l
    return l

# Get parents of a vpatch
pcache = {}
def parents9(vpatch):
    parents = vpdata(vpatch, r'^--- (\S+) (\S+)$', pcache)
    if not parents:
        fatal("{0} is INVALID, check whether it IS a vpatch!".format(vpatch))
    return parents

# Get children of a vpatch
ccache = {}
def children10(vpatch):
    children = vpdata(vpatch, r'^\+\+\+ (\S+) (\S+)$', ccache)
    if not children:
        fatal("{0} is INVALID, check whether it IS a vpatch!".format(vpatch))
    # Record descendents:
    for child in children:
        h = child['h']
        if h != 'false':
            desc[h] = vpatch
    return children

# It is entirely possible to have more than one root!
# ... exactly how, is left as an exercise for readers.
def find_roots11(patchset):
    rset = []
    # Walk, find roots
    for p in patchset:
        if all(p['h'] == 'false' for p in parents(p)):
            rset += [p]
            spew("Found a Root: '{0}'".format(p))
    return rset

# Get antecedents.
def get_ante12(vpatch):
    ante = {}
    for p in parents(vpatch):
        pp = desc.get(p['h']) # Patch where this appears
        if not ante.get(pp):
            ante[pp] = []
        ante[pp] += [p['p']]
    return ante

# Get descendants.
def get_desc13(vpatch):
    des = {}
    for p in patches:
        ante = get_ante(p)
        if vpatch in ante.keys():
            des[p] = ante[vpatch]
    return des

##############################################################################

# Print name of patch and its guarantors, or 'WILD' if none known.
def disp_vp14(vpatch):
    seals = ', '.join(map(str, banners[vpatch]))
    if seals == '':
        seals = 'WILD'
    return "{0} ({1})".format(vpatch, seals)

##############################################################################

# Command: WoT
def c_wot15(args):
    for k in pubkeys.values():
        print "{0}:{1} ({2})".format(k['handle'], k['fp'], k['id'])

# Command: Flow
def c_flow16(args):
    for p in patches:
        print disp_vp(p)

# Command: Roots.
def c_roots17(args):
    for r in roots:
        print "Root: " + disp_vp(r)

# Command: Antecedents.
def c_ante18(args):
    ante = get_ante(args.query)
    for p in ante.keys():
        if p != 'false':
            print "{0} [{1}]".format(disp_vp(p), '; '.join(map(str, ante[p])))

# Command: Descendants
def c_desc19(args):
    des = get_desc(args.query)
    for d in des.keys():
        print "Descendant: {0} [{1}]".format(disp_vp(d), '; '.join(map(str, des[d])))

# Command: Press.
def c_press20(args):
    print "Pressing using head: {0} to path: '{1}'".format(args.head, args.dest)
    headpos = patches.index(args.head)
    seq = patches[:headpos + 1]
    os.mkdir(args.dest)
    for p in seq:
        print "Using: {0}".format(disp_vp(p))
        os.system("patch -E --dir {0} -p1 < {1}".format(args.dest, p))
    print "Completed Pressing using head: {0} to path: '{1}'".format(args.head, args.dest)

# Command: Origin.
def c_origin21(args):
    o = desc.get(args.query)
    if o:
        print disp_vp(o)
    else:
        print "No origin known."

##############################################################################

##############################################################################
# Command line parameter processor.22
parser = argparse.ArgumentParser(description=intro, epilog=prolog)

# Print paths, etc
parser.add_argument('-v', dest='verbose', default=False,
                    action="store_true", help='Verbose.')

# Permit the use of patches no one has yet sealed. Use this ONLY for own dev work!
parser.add_argument('-wild', dest='wild', default=False,
                    action="store_true", help='Permit wild (UNSEALED!) vpatches.')

# Glom keyid (short fingerprint) onto every WoT handle.
parser.add_argument('-fingers', dest='fingers', default=False,
                    action="store_true", help='Prefix keyid to all WoT handles.')

# Default path of WoT public keys is /home/yourusername/.wot
# This dir must exist. Alternatively, you may specify another.
parser.add_argument('--wot', dest='wot', default=os.path.join(os.path.expanduser('~'), '.wot'),
                    action="store", help='Use WoT in given directory. (Default: ~/.wot)')

# Default path of the seals (PGP signatures) is /home/yourusername/.seals
# This dir must exist. Alternatively, you may specify another.
parser.add_argument('--seals', dest='seals', default=os.path.join(os.path.expanduser('~'), '.seals'),
                    action="store", help='Use Seals in given directory. (Default: ~/.seals)')

# REQUIRED: Path of directory with vpatches.23
parser.add_argument('vpatches', help='Vpatch directory to operate on. [REQUIRED]')

# REQUIRED: Command.24
subparsers = parser.add_subparsers(help='Command [REQUIRED]')

parser_w = subparsers.add_parser('w', help='Display WoT.')
parser_w.set_defaults(f=c_wot)

parser_r = subparsers.add_parser('r', help='Display Roots.')
parser_r.set_defaults(f=c_roots)

parser_a = subparsers.add_parser('a', help='Display Antecedents [PATCH]')
parser_a.set_defaults(f=c_ante)
parser_a.add_argument('query', action="store", help='Patch.')

parser_d = subparsers.add_parser('d', help='Display Descendants [PATCH]')
parser_d.set_defaults(f=c_desc)
parser_d.add_argument('query', action="store", help='Patch.')

parser_l = subparsers.add_parser('f', help='Compute Flow.')
parser_l.set_defaults(f=c_flow)

parser_p = subparsers.add_parser('p', help='Press [HEADPATCH AND DESTINATION]')
parser_p.set_defaults(f=c_press)
parser_p.add_argument('head', action="store", help='Head patch.')
parser_p.add_argument('dest', action="store", help='Destination directory.')

parser_o = subparsers.add_parser('o', help='Find Origin [SHA512]')
parser_o.set_defaults(f=c_origin)
parser_o.add_argument('query', action="store", help='SHA512 to search for.')

##############################################################################

# V cannot operate without vpatches, WoT, and Seals datasets.
def reqdir25(path):
    if (not (os.path.isdir(path))):
        fatal("Directory '{0}' does not exist!".format(path))
    return path

def main26 ():
    global verbose, pubkeys, patches, roots, banners

    args = parser.parse_args()
    verbose = args.verbose

    # Patch and Sigs dirs
    pdir = reqdir(args.vpatches)
    sdir = reqdir(args.seals)
    wdir = reqdir(args.wot)

    spew("Using patches from:" + pdir)
    spew("Using signatures from:" + sdir)
    spew("Using wot from:" + wdir)

    pfiles = dir_files(pdir)
    sfiles = dir_files(sdir)
    wfiles = dir_files(wdir)

    # Build WoT from pubkeys
    handle = {}
    for w in wfiles:
        pubkey = open(w, 'r').read()
        impkey = gpg.import_keys(pubkey)
        for fp in impkey.fingerprints:
            handle[fp] = os.path.splitext(os.path.basename(w))[0]

    for k in gpg.list_keys():
        name = handle[k['fingerprint']]
        if args.fingers:
            name += '-' + k['keyid']
        pubkeys[k['keyid']] = {'fp':k['fingerprint'],
                               'id':', '.join(map(str, k['uids'])),
                               'handle':name}

    # Validate seals
    for p in pfiles:
        pt = os.path.basename(p)
        banners[p] = []
        for s in sfiles:
            sig = os.path.basename(s)
            # All seals must take the form patchtitle.vpatch.yourname.sig
            if sig.find(pt) == 0: # substring of sig filename up through '.vpatch'
                v = gpg.verify_file(open(s, 'r'), data_filename=p)
                if v.valid:
                    banners[p] += [pubkeys[v.key_id]['handle']]
                else:
                    fatal("---------------------------------------------------------------------\n" +
                          "WARNING: {0} is an INVALID seal for {1} !\n".format(sig, pt) +
                          "Check that this user is in your WoT, and that this key has not expired.\n" +
                          "Otherwise remove the invalid seal from your SEALS directory.\n" +
                          "---------------------------------------------------------------------")

    # Select the subset of vpatches currently in use.
    for p in pfiles:
        if banners.get(p) or args.wild:
            patches += [p]
            children(p) # Memoize.
            parents(p) # Memoize.

    roots = find_roots(patches)
    if not roots:
        fatal('No roots found!')

    # Topological ordering of flow graph
    l = []
    for p in patches:
        l += [(p, get_desc(p).keys())]
    s = map(lambda x:x[0], toposort(l))
    patches = s[::-1]

    # Run command
    args.f(args)

    # Remove temporary keychain
    shutil.rmtree(gpgtmp)

##############################################################################

if __name__ == '__main__' :
    main()

##############################################################################
  1. In Kelvin versioning subsequent version numbers decrease until one has working code, hopefully before version 0. Instead of growing indefinitely like a tumor, software written using this versioning system is meant to crystallize into a program that does one job well. []
  2. This is the software license of ~TMSR. The subtext is that software is owned by the person running it. []
  3. This is a standard topological sort, essentially Kahn's algorithm run on outgoing edges (not Dijkstra's, which is a shortest-path algorithm). I wrote my own version here. Stan starts with an unsorted dictionary of nodes mapped to their outgoing edges. The function iterates through unsorted until it is empty. On each iteration, if a node has no outgoing edges that point to a node in unsorted, the node is appended to the list sorted. If the toposort goes through this iteration without finding a leaf node, then the graph must be cyclic and the toposort fails. Note that Stan's toposort returns the reverse of what is conventionally returned by topological sort. []
  4. A simple function that takes a msg, writes it to standard error, and exits the program returning 1 to signal failure. []
  5. Prints the given message if verbose is set to true. []
  6. The comment for this function describes it perfectly. os.walk will return an iterator that yields a value for every subdirectory. The magical index 2 selects the list of file names inside a given directory in the directory tree. []
  7. ben_vulpes describes why the three lines above are done. gpg by default saves the state of imported keys. For V, however, we only want to use the keys that are in the wot directory at the time of running a command. Setting up this temporary directory is thus done as an implementation detail for running gpg with only the keys found in the wot dir. []
  8. This is an auxiliary method used to find children and parents of vpatches. Its parameters include a path to a vpatch and a regular expression. The regex is used to match to the lines found in a vpatch that look like:
    +++ filepath filehash
    or
    --- filepath filehash

    The third parameter, cache, is the dictionary used for memoization. []
  9. Returns a list of the parents of a vpatch using vpdata. Returns an error if it finds none. It's important to note that a parent refers to the filepath/filehash pair that is found after the "---" lines in a vpatch. They are not to be confused with antecedents, which are vpatches. The same goes for children, which are the filepath/filehash pairs found after "+++" lines in a vpatch. []
  10. Returns a list of the children of a vpatch. After finding the children, the function maps the child's filehash to the vpatch in the dictionary desc. It does not count children who have a filehash of 'false' since those are files that have been deleted. []
  11. Filters a given list of patches for the ones that are roots. A genesis patch is a root, but a root is not necessarily a genesis patch. Stan leaves an exercise for the reader to show how one could have more than one root. The answer: a vpatch set will have multiple roots if there are at least two vpatches that do not modify or delete any existing files and instead only create files. []
  12. Iterates through the parents of a vpatch, and then finds the corresponding vpatch that introduced the files in the parents. Returns a dictionary mapping the vpatch's antecedents to the filepaths of the intersection of the antecedent's children and the vpatch's parents. []
  13. Iterates through all patches, getting their antecedents. If the given vpatch is in the antecedents of another patch, then that patch is a descendant of the given vpatch. Returns a dictionary mapping the vpatch's descendants to the filepaths of the intersection of the vpatch's children and the descendant's parents. []
  14. Self explanatory. Although, the comment is technically incorrect: the function just returns the string with the name of the patch and its guarantors (or WILD if unknown), it does not print the string. []
  15. Prints the list of the keys in the wot directory. []
  16. Prints all the patches in the patch directory with their guarantors as per disp_vp. []
  17. Prints all the roots. These are usually only genesis patches, but could technically be any patch that doesn't modify/delete any existing files. []
  18. Takes a vpatch and prints all of its antecedents. []
  19. Takes a vpatch and prints all of its descendants. []
  20. Presses to a given head vpatch and outputs the result in a given directory. Applies the already topologically sorted vpatches in order up to and including the given head vpatch. If the head is in a vtree that diverges, there is no guarantee that a divergent branch that does not include the head will be pressed as well. This may be considered a bug; vpy needs to be used with care. []
  21. Takes a filehash and prints the vpatch that generated the corresponding file or "No origin known.". []
  22. The code in this section sets up the command line parameter processor. First Stan sets up the optional arguments: verbose, wild, fingers, wot, and seals. The add_argument command used for this takes the parameters: dest, a string that will determine how to access the argument; default, the default value for the parameter; action, either sets the dest to true or sets the dest to the argument given; help, gives a description of the optional parameter.
    -v makes the code liberally print what it's doing, -wild lets one press with unsealed patches, -fingers appends a short fingerprint to every wot handle, --wot takes a path to a directory of wot keys to replace the default /home/username/.wot, and --seals takes a path to a directory of signatures of vpatches to replace the default /home/username/.seals []
  23. Stan's code requires you provide the path to the vpatches on every run; other implementations I've seen have a default patches directory, just like .wot and .seals. []
  24. Sets up the list of commands. These fire off the functions that have a prefix c_ in their name; their jobs are described above. The commands are: w (displays WOT), r (displays Roots), a (displays the antecedents of a given vpatch), d (displays the descendants of a given vpatch), f (displays the patches topologically sorted), p (presses a vtree), o (gets the vpatch that generated the given filehash) []
  25. Fails the program if any of the vpatches, wot, or seals directory do not exist. []
  26. The main function stores the arguments from the parser, and then does various setup such as loading the vpatches while checking for their seals and doing the topological sort on the vpatches. Once done initializing it runs the command given by the user and then cleans up by removing the temporary directory it had to make for gnupg. []
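To check my reading of the toposort in footnote 3, here is a Python 3 re-run of the same logic (the only change is snapshotting the dictionary items, since Python 3 forbids deleting entries mid-iteration), exercised on a three-patch flow graph:

```python
def toposort(unsorted):
    """Stan's toposort: repeatedly peel off nodes whose outgoing edges
    all point outside the remaining unsorted set. Raises on a cycle."""
    ordered = []
    unsorted = dict(unsorted)
    while unsorted:
        acyclic = False
        for node, edges in list(unsorted.items()):
            if not any(e in unsorted for e in edges):
                acyclic = True
                del unsorted[node]
                ordered.append((node, edges))
        if not acyclic:
            raise ValueError("Cyclic graph!")
    return ordered
```

On a flow genesis -> a -> b, with edges pointing at descendants as main() builds them, it returns b, a, genesis; the s[::-1] reversal in main() then yields the genesis-first press order, exactly as footnote 3 notes.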

Vpatch Study Part 2 - Topological Sort Example

Friday, October 18th, 2019

I have written a program that topologically sorts a list of python tuples representing vpatches1 as an exercise to understand V. A few tests are included. From this exercise I learned that I don't know exactly what it means for a vpatch to be a parent of another vpatch. I hope to resolve this point of confusion shortly.

I do not plan to continue working on this program unless otherwise directed. My next step is to publish my understanding of the what and why of V, followed by my annotations of asciilifeform's v.py.

  1. The datatype has the minimum information I believe is necessary to be able to perform a topological sort. []

V Study Part 1 - Vpatches and Vdiff

Tuesday, October 15th, 2019

Creating source using V is done by sequentially applying a set of vpatches through a process known as pressing. To press, V is given the most recent vpatch and an output directory. V then finds a path from the given vpatch to the genesis vpatch. Starting with the genesis vpatch, V applies each vpatch along the found path and dumps the result into the given output directory. In this post I go over how the vpatches used in this process are created.
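The path-finding step can be sketched as a walk up parent links, reversed so the genesis lands first (the parent_of map here is a hypothetical illustration, not V's actual data structure):

```python
def press_order(head, parent_of):
    """Return the vpatches to apply, genesis first and head last.
    parent_of is a hypothetical map from each vpatch name to its
    parent; the genesis vpatch maps to None."""
    path = []
    cur = head
    while cur is not None:
        path.append(cur)
        cur = parent_of[cur]
    return path[::-1]  # genesis first, head last
```

Pressing to head b over genesis -> a -> b thus applies genesis, then a, then b.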

To make a vpatch, a developer starts with a copy of the source already pressed to the previous most recent vpatch. We'll say for example this source is in a directory named oldversion. The developer then copies the source in oldversion to another directory that we'll call newversion. In the directory newversion he makes the source modifications that will constitute the vpatch. When finished, the developer runs

vdiff oldversion newversion

An example of the code for the vdiff program, taken from the bitcoin foundation, is reproduced in one line below:


diff -uNr $1 $2 | awk 'm = /^(---|\+\+\+)/{s="sha512sum \"" $2 "\" 2>/dev/null  " | getline x; if (s) { split(x, a, " "); o = a[1]; } else {o = "false";} print $1 " " $2 " " o} !m { print $0 }'

Running vdiff on the two directories creates the vpatch file, which is similar to a diff file obtained from running

diff -uNr oldversion newversion

The difference is vdiff replaces vanilla diff's file modification timestamps with hashes1 of the file's content. spyked articulates the importance of this in a recent thread he had with me in #o.

spyked: whaack, problem is that classical diff/patch leave room for ambiguity, i.e. in principle it's possible to (cleanly) apply a single hashless patch to different files, which results in different presses. so hashes are needed in order to identify the file (not only path/name, which is only metadata required for retrieval) as it is before/after applying the patch.
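The header substitution the awk one-liner performs can be sketched in Python, with a hypothetical hash_of callable standing in for the sha512sum invocation:

```python
import re

def vdiff_line(line, hash_of):
    """Rewrite a diff header line ('--- path meta' or '+++ path meta')
    so the metadata after the path becomes the file's sha512 hex digest,
    or 'false' when the file is absent. hash_of is a hypothetical
    callable: path -> hex digest, or None for a missing file."""
    m = re.match(r'^(---|\+\+\+) (\S+)', line)
    if not m:
        return line  # Ordinary diff body line: pass through untouched.
    mark, path = m.groups()
    digest = hash_of(path)
    # Note: like the awk original, this also fires on a *body* line that
    # happens to begin with '+++', the quirk demonstrated below.
    return "{0} {1} {2}".format(mark, path, digest or "false")
```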

I still need to fully digest the awk command that replaces the file timestamps with the file content hashes. But one quirk I noticed was that certain crafted files cause the awk command to incorrectly match certain lines. For example, if you have


$tree
.
├── newversion
│   └── fool.txt
├── oldversion

$cat newversion/fool.txt
++ trick.txt this_should_be_in_the_vpatch2

then

$vdiff oldversion newversion

will produce


diff -uNr oldversion/fool.txt newversion/fool.txt
--- oldversion/fool.txt false
+++ newversion/fool.txt 27991f54fb2534c59b6c0667f9a91d8bd9173b5cc3184aeea251c2435b7808457a5492add5646793738a1f3e9c32892a2261e18eb0e3a2d0d7a0486735bf43a8
@@ -0,0 +1 @@
+++ trick.txt false

the last line should be

+++ trick.txt this_should_be_in_the_vpatch

but it was mistakenly altered by the awk command. This incorrect modification to the vpatch makes the resulting fool.txt file have the wrong contents after pressing.3 However if, while pressing, V checks that the hashes of the resulting files match the intended files' hashes found in the vpatch, V will correctly spot this error and fail to press. This gives an example of how diana_coman was right when responding to my point of confusion here

whaack: got it, i understand that the hashes are needed to identify the files. but regarding hashing the files yourself after every patch, the vpatches already let you know what the output hash will be. so if you trust the vpatch to the point where you're going to run the code outputted by it, then you should trust its claim of what the output of the hash would be. hashing the output files yourself after every patch then becomes more of a
whaack: data-integrity check.
diana_coman: the vpatches let you know what the output hash *should be*
diana_coman: nobody can let you know upfront what it *will be*; in general
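The press-time check described above - hashing each pressed file and comparing against the hash the vpatch claims for it - can be sketched as follows. The function and argument names are mine, not any particular v-tron's; expected_hashes maps a relative path to the hash stated for it in the vpatch.

```python
import hashlib
import os

def verify_press(output_dir, expected_hashes):
    """After pressing, recompute each file's sha512 and compare it to
    the hash the vpatch claims the file *should* have.  A mismatch
    means the press did not produce the intended source and must be
    rejected."""
    for relpath, claimed in expected_hashes.items():
        with open(os.path.join(output_dir, relpath), "rb") as f:
            actual = hashlib.sha512(f.read()).hexdigest()
        if actual != claimed:
            raise ValueError("hash mismatch in " + relpath)
```

Run against the fool.txt example above, this check would catch the corrupted press, since the pressed file's hash would not match the hash recorded in the vpatch header.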

  1. Originally, the hash function used in vdiff was sha512, as I have in the vdiff program I posted. Now the hash function used is keccak. The benefits of using keccak over sha512 are beyond the scope of both me and the post. []
  2. Note that I put two +'s at the start of the one line in this file. To show this line was added, the diff command's output will contain a "+" followed by the line's contents. This produces a line in the diff output with three sequential +'s that refers to a file's content. The awk command will incorrectly match this line and attempt to replace this_should_be_in_the_vpatch with the hash of the non-existent file trick.txt. []
  3. "this_should_be_in_the_vpatch" was replaced with the word "false" because the hash of the file "trick.txt" does not exist. []

V Study Reference Links

Tuesday, October 15th, 2019

My master diana_coman has assigned me the task of creating a report on my understanding of V that includes a v.py with my own annotations. I have collected related links here for my own reference as well as for anyone else on a similar path. I will update this post as I locate more material.

  1. ben_vulpes's V-tronics 101: A gentle introduction to The Most Serene Republic of Bitcoin's cryptographically-backed version control system 1
    1. mp's ode to V / ode to Genesis patch2
    2. mp's v manual genesis.
    3. mod6's perl v
  2. esthlos's a v-tron, a cl V he made himself. He has many updates to this vtron that are found on his homepage.
  3. asciilifeform's v.py3
  4. A note from spyked's blog about the shortcoming of v.py.
  5. trinque's v-manifest spec draft4
  6. mp's A new software licensing paradigm
  1. This post has links to other useful material some of which I include on this post as well. []
  2. link was dead at time of writing; mod6 has since fixed it. []
  3. This was written by asciilifeform but I only have found phf's signature. There is a patch by phf to have asciilifeform's v.py use vtools. []
  4. Linking archived version until trinque recovers his blog. []

Recovery

Thursday, October 10th, 2019

The little hut known as ztkfg.com was engulfed by the flames of the recent fire within tmsr. This blog now temporarily lives in enemy territory while I wait for republic lands to return to their inhabitable state.

Previous blog posts should be restored shortly.

UPDATE: Previous posts have been salvaged from the fire.1

  1. Thanks to BingoBoingo for saving my ass, with the help of trinque. []

Past TMSR Work, Potential Future TMSR Work

Sunday, September 22nd, 2019

My contributions to the republic, accumulated while I spent years twiddling my thumbs reading the logs, are as follows:

1. 5 Qntra posts, found here: 1

My first post was an inside perspective of MIT's "blockchain" curriculum. It confirmed what the republic already knew, namely that there was no interesting work going on at MIT re bitcoin, and any "work" being done there was hostile towards republican interests. Two other posts were tabloidal, making fun of pantsuitism. And the last two posts were reports on Coinbase shenanigans during the bitcoin cash hard fork.

2. Setting up my own trb bitcoin node. (failed)

There was naivety in my attempts to set up a running bitcoin node. When I first attempted to set up a node, I tried to get it going on an old unused laptop. One mistake was believing that 2GB of RAM was enough to get a timely block sync. I had thought at the time that the only bottleneck to getting a node up to speed was downloading the blocks, and I did not intuit the time it takes to locally verify all the blocks along the way.2 I later attempted to sync a node on a dedicated machine hosted by dreamhost.com,3 paying a little over $100 per month. I can't quite recall what happened, but I think around block 350,000 it got stuck. Later, without trying to reboot bitcoind, I decided to cut my expense with dreamhost and gave up on running a full node.

3. Researching how many bitcoins are tied up in P2SH4 (failed)

The goal was getting an upper bound of how many coins are in anyone-can-spend scripts in order to answer the question: how many coins are in addresses related to segwit?

To do this, I first used ben_vulpes's block explorer5 to grab sexprs containing the data for every block. This was obtained by looping from 0...max_block_height and running
wget -O blk{n} http://mimisbrunnr.cascadianhacker.com/blocks/blk{n}
where {n} was the block number. While I was running a loop performing this task I noticed that occasionally ben's block explorer would give me some malformed file - and I had to simply re-wget the same url until I got back a properly formatted sexpr. It took a while to download all the blocks from Ben (even though I was not verifying them) and so I paid for a digitalocean droplet to run my scraper script on.6

Once I had blocks 0..n, I ran a script7 that would go through a chunk of blocks and keep an ongoing hashmap mapping "(txn hash, output number) -> num_satoshis_sent_to_output" for all the outputs in the block that were sent to non-trb conforming addresses. For each new block, the script would first iterate through the txns in the block to see if any of them spent the coins in the ongoing hashmap obtained from all the previous blocks. If a txn in the new block consumed one of the P2SH UTXOs that was being stored, that UTXO would be deleted from the ongoing hashmap. Once the purge of transaction outputs that had just been spent was completed, the script reiterated through the new block's txns to add any txn outputs that were directed to non-trb P2SH's to the ongoing hashmap. After iterating through all the blocks, one could calculate how many satoshis were in non-trb addresses by summing up all the values in the obtained hashmap.
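My actual script was Common Lisp; the bookkeeping it did can be sketched in Python roughly as follows. The block/txn shapes here are illustrative stand-ins, not ben_vulpes's sexpr format: a block is a list of txns, and a txn is a dict with a 'hash', a list of 'inputs' as (txn_hash, output_number) spends, and a list of 'outputs' as (satoshis, is_p2sh) pairs.

```python
def scan_blocks(blocks):
    """Two passes per block: first purge UTXOs this block spends,
    then record this block's new P2SH outputs.  Returns the total
    satoshis sitting in unspent P2SH outputs after the last block."""
    utxos = {}  # (txn hash, output number) -> satoshis in a P2SH output
    for block in blocks:
        # Pass 1: delete any stored UTXO consumed by this block's txns.
        for txn in block:
            for spent in txn["inputs"]:
                utxos.pop(spent, None)
        # Pass 2: add this block's outputs that pay to a P2SH.
        for txn in block:
            for n, (satoshis, is_p2sh) in enumerate(txn["outputs"]):
                if is_p2sh:
                    utxos[(txn["hash"], n)] = satoshis
    return sum(utxos.values())
```

The ongoing hashmap is exactly the utxos dict; the memory problem I mention below comes from that dict growing with every unspent segwit output ever created.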

I don't recall at what point/why I just faded away and stopped working on this tool. It may have been because I hit a problem with running out of memory for storing all the segwit UTXOs. It was an interesting investigation and perhaps the republic would still find a counter of coins that are contained in non-trb addresses useful. Which brings us to part two of this post:

Potential Future TMSR Work

diana_coman: whaack_pura_vida: that8 is obsolete so not a lot of help in itself; nobody is going to make the list ready for you to pick and choose, wtf.
diana_coman: whaack_pura_vida: publish what you figure out by Sunday together with *how you went about* the figuring out

My initial internal response to diana_coman was "(1) why is that a ridiculous expectation since there previously was a list of entry points? and (2) how is that list obsolete if a young hand such as shrysr is digesting V, which is more or less a task on that list?"

The best answer I can come up with to my own questions is "Yes, a list was once generously made, but that doesn't mean the lords have time to keep an up-to-date task list for noobs. That post was THREE YEARS AGO and now there are new tasks to do - which you must find yourself. The current task list may or may not coincide with the three-year-old post; you have to have read the logs to find out."

With that being said, and keeping with the "how you went about figuring out", here is a list of potential tasks, with an annotation denoting how/why I came to choose that task.

1. Creating my own V9
2. Related to 1, taking up the task of maintaining a vpatch viewer10
3. Creating a new trb block explorer11
4. Continuing fighting the war on Segwit, first by completing the task of sizing up the coins held in P2SH. 12
5. Learning ADA and completing Stan's FFA series.13

Tasks (3) and (4) seem the most interesting to me, but I believe the v-related tasks (1) and (2) should be my starting point.

  1. These were edited by BingoBoingo, and one was improperly formatted when sent to him. So these contributions may have even been net negative depending on how much time BingoBoingo had to spend to correct my mistakes. []
  2. My intuition was likely skewed because my first experience with running a bitcoin node was using power ranger software which used SPV, effectively making my computer search for the longest chain instead of the longest valid chain. []
  3. Originally I had thought, what help is it to run a node on someone else's iron? I still believe it is not that useful, you are only temporarily increasing the redundancy of the bitcoin network, but at any moment the enemy can flip a switch and you go offline. Adding a node to pizarro also has dubious utility, because from my understanding the republic already has a few nodes there at 161.0.121.248 and 161.0.121.250. []
  4. pay to script hash []
  5. dead at time of writing []
  6. This was also discontinued when I did some cleaning out of expenses. And I wiped everything off the droplet without first taking a local copy. []
  7. A lot of CL weird and sloppy code. Some of it is copy and pasted from code Ben was using to analyze his own block explorer. []
  8. http://trilema.com/2016/how-to-participate-in-the-affairs-of-the-most-serene-republic/ []
  9. The initial idea was planted by the trilema post of entry to affairs. That being said, V seems a natural starting point for working with republican code. It demonstrates understanding of the tool required to publish and use any code in the republic []
  10. per the suggestion of trinque. []
  11. Found this may be useful by going through old tasks and noting that Ben's old block explorer mimisbrunnr had died. []
  12. I figure that the lords best spend their time fortifying their castle walls rather than going out to fight against nonsense like Segwit. But perhaps a noob could prove his worth by taking on this neglected task. []
  13. This task seems a useful start for the same reason the V tasks seem useful: to prepare a young hand by learning the tools used to contribute to the republic. In addition, from my understanding only a few have gone through any of Stan's series. But apart from the additional proofread, this is a personal development goal rather than a contribution. []