department of hack
888 stories

the knack

1 Share

The thing I hated most about being a child in the 1950s was that you couldn’t just open the cupboard. There was a knack to it. There was a knack to everything. Nothing fitted. Nothing worked. Nothing did what it was designed to do without some further persuasion, the application–the added value–of massively embedded and localised knowledge you didn’t have. There was a knack to opening it and a knack to closing it again. There was a knack to winding it up, or getting it to actually supply electricity to the bulb or stick to the inner tube. It was just a knack. There was a knack to getting it started in the morning. If you didn’t have the knack, you were already in arrears. Your place in the hierarchy was low. The reproducible was, for you, non-reproducible. This wasn’t just some door your father had bodged, some deflated bike tyre that would inflate for everyone in the family but you. It turned out to be everything. Entire factories–transport infrastructures–entire industries–depended on people who had got the knack of not very well designed, not very reliable machinery. School was learning that “to learn” meant to learn the knack. There were apprenticeships that taught only the knack, and indeed knackism. Entire disciplines–like toolmaking and all kinds of assembly–were run by instinct and by eye; they were run on, fucked up and then solved by the knack. In fact they had been devised to run on the knack; it was built-in, it was the crap code that lay underneath everything. Once you got the knack of it, you were fine. Until I found language I never got a knack. But I expect you knew that. A weird side effect of growing up without the knack was that I came to loathe even slightly broken or inefficient stuff and now have difficulty keeping it near me. I understand the problematics of throwaway tech, but I’m afraid understanding them won’t cure the neurosis. Next: rationing, especially of chocolate.

Read the whole story
3 days ago
Boulder, CO
Share this story

webshit weekly

An annotated digest of the top "Hacker" "News" posts for the second week of October, 2018.

Shutting Down Google+ for Consumers
October 08, 2018 (comments)
Google left your shit out in the rain, and has prepared an interpretive security dance to distract you. Hackernews solemnly praises the terrible unwanted trash product at the center of the latest mishap, and writes some fanfiction about which other trash products might copy parts of Google's failed attempt.

How to Get Things Done When You Don't Feel Like It
October 09, 2018 (comments)
A bureaucrat pontificates about getting work done when you don't care about it. All of the suggested approaches are based on pop psychology and buzzwords; the term 'self-discipline' does not occur once in the entire article. This omission makes it extremely attractive to Hackernews, who gleefully detail all of the grotesque habits they've ritualized in pursuit of the ability to emulate fully-functional human beings. The party continues until one weirdo shows up and complains that the only successful approach is engaging with other human beings, so Hackernews convenes an intervention panel to diagnose what disgusting malfunction could possibly have led to his bizarre behavior.

Microsoft Joins the Open Invention Network
October 10, 2018 (comments)
Microsoft sneaks into the henhouse. Hackernews is torn between excitement at the prospect of being able to clone software they wrote at their last jobs and a creeping unease whenever the memory surfaces of the last eight hundred times someone tried to cooperate with Microsoft. The former group is pleased with the extensive list of half-assed standards they are now free to port to node.js, and the latter group starts a fistfight about some guy who landed in the pokey for selling copied Windows discs.

Astronauts escape malfunctioning Soyuz rocket
October 11, 2018 (comments)
The Soyuz campaigns to be renamed Распускать. Hackernews has nothing of value to contribute to this event, so they spend the afternoon constructing narratives of the proceedings based on Twitter posts. When that gets dull they start mining Wikipedia for trivia to report in the manner of baseball commentators reading player stats during a slow game.

Every Byte of a TLS Connection Explained and Reproduced
October 12, 2018 (comments)
An Internet documents a commonly-used protocol, then shows up in the comments to announce the use of a CDN to serve a single static page of HTML. The vote-to-comment ratio on this article is in excess of ten to one, which means Hackernews bookmarked this page but has not yet actually read it.

Teach Yourself to Echolocate: A beginner’s guide to navigating with sound
October 13, 2018 (comments)
An Internet has a plan to make children even more noisy and clumsy than they already are. Interpol is dispatching teams to haul the author back to The Hague to answer for this crime. Hackernews takes a break to reminisce about old websites, trading links to a few on the grounds that there is no search engine worth a shit. The rest of the Hackernews discuss how important hearing is, as though that is surprising information which needs explicit mention. A few Hackernews are extremely excited about date calculations.

How I’ve Attracted the First 500 Paid Users for My SaaS
October 14, 2018 (comments)
A webshit announces a breakthrough plan to acquire customers: talk to people and find out what they want, then sell it to them. Hackernews scoffs at this naive and ridiculous approach. They don't have any real reason to believe it can't work, but this is the only thinkpiece advocating it, so it is Obviously Wrong. The author shows up and only engages with Hackernews asking productive questions, which further enrages the rest. Buried within the bottom third of the comment page are the posts from other people who have taken similar approaches and met with success. Nobody replies.

The object of desire

Yesterday, I began to hear rumors that something was out in the world. My first clue was a congratulatory note from my agent in New York, who sent me an email with the subject line: “It’s a book!” The message itself was blank, except for a picture of his desk, on which he had propped up the hardcover of Astounding. A few hours later, I saw an editor for a pop culture site post the image of a stack of new books on Twitter, with mine prominently displayed about a third of the way from the bottom. In the meantime, there wasn’t any sign of it on my end—I hadn’t even seen the finished version yet. (I signed off on the last set of proofs months ago, and I’ve spent an inordinate amount of time admiring the cover art, but that isn’t quite the same as holding the real thing in your hands.) When the mail came that afternoon, there was nothing, so I figured that it would take another day or two for any shipment from my publisher’s warehouse to make it out to Chicago. In the evening, I headed out to the city, where I was meeting a few writers for dinner before our event at Volumes Bookcafe. When one of my friends arrived at the restaurant, he announced that he had heard a thud on his doorstep earlier that day, and he proudly pulled out his personal copy of the hardcover, from which he had prudently removed the dust jacket. At this point, I was starting to suspect that everybody in America would get it before I did, and when I arrived at the bookstore, I was genuinely shocked to see a table covered with copies of the book, which doesn’t officially come out until October 23. And although I should have been preparing for my reading, I took a minute to carry one into a quiet corner so that I could study it for myself.

Well, it definitely exists, and it’s just as beautiful as I had hoped. As a writer, I don’t have any control over the visual side, but the artist Tavis Coburn and the designers Ploy Siripant and Renata De Oliveira did a fantastic job—I’m obviously biased, but I don’t think any book about science fiction has ever come in a nicer package. The fact that I managed to get the hardcover version out into the world before physical books disappeared entirely is a source of real pride, and I look forward to seeing copies of it in thrift stores and cutout bins for years to come. And while I can’t speak to the contents, at first glance, they seemed perfectly fine, too. After the reading, which went well, I made my first sale of Astounding ever in a bookstore, and as I signed all the remaining copies that the store had on hand, I was sorely tempted to buy one for myself. I sent a picture of the stack on the display table to my wife, who texted back immediately: “Your copies came! One big box and one small one.” An hour or so later, I was back home, where I sliced open the first carton, then the second, to reveal my twenty-five author’s copies. (I’ll keep three for myself and gradually start to send the rest to various deserving recipients.) Now it’s the following morning, and the book is inexorably starting to assume the status of a familiar object. It’s lying at my elbow as I type this, and I can already feel myself taking it for granted. I suppose that was inevitable. But I’ll always treasure the memory of the day in which everyone I knew seemed to have it except for me.

Dear People Trying to Store Energy via Gravity: Please Stop

tl;dr– People who suggest suspending weights as a viable energy storage scheme are either fools or charlatans.

Alternative energy. So hot right now.

So hot, in fact, that a common meme lately has been the idea of getting energy via gravity – that is, harnessing the power of slowly falling suspended weights. After all, it works for hydropower, and that’s about as green as energy can get. Sadly, while these armchair inventors’ motives are admirable, their effort could have been better spent learning some basic physics. Gravity simply doesn’t store very much energy.

Basic Physics

The amount of energy you can store with gravity is dictated by E = m × g × h, where:

m = mass suspended
g = strength of gravity (9.8 m/s² at Earth’s surface)
h = height of the drop

Pretty basic, right? As you will notice, none of these numbers is particularly large.
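To make that concrete, here is a minimal Python sketch of the formula (the weight and height are illustrative, not taken from any product):

```python
# Gravitational potential energy: E = m * g * h
G = 9.8  # m/s^2, gravitational acceleration at Earth's surface

def stored_energy_joules(mass_kg: float, height_m: float) -> float:
    """Joules recoverable from lowering mass_kg through height_m."""
    return mass_kg * G * height_m

# Even a hefty 100 kg weight hoisted 3 m stores about 2.9 kJ --
# a small fraction of the chemical energy in one alkaline AA cell (~10 kJ).
print(stored_energy_joules(100, 3))  # ~2940 J
```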


For lighting, the second consideration is how efficiently you can convert energy into visible light. This conversion factor (called luminous efficacy) is expressed as lumens per watt. State-of-the-art LEDs have luminous efficacies between 100 lm/W (readily available on the market) and 200 lm/W (exotic laboratory prototypes). The maximum possible luminous efficacy is 683 lm/W, at which point all energy is being converted to light – pure green light at that (why? because human eyes are most sensitive to green, so concentrating all energy there gives the most bang for the visibility buck). The theoretical maximum for light that could be perceived as ‘white’ is ~370 lm/W.

How bright is a lumen? A 60-watt incandescent bulb shines about 800 lumens. A “standard candle” is 12 lumens.

Stored energy combined with a luminous efficacy yields a total “light energy” measured in lumen-seconds, wherein you can trade off a brighter light (more lumens) for a shorter time (fewer seconds) vs. a dimmer light for a longer time.

Also, for the sake of simplicity and benefit of the doubt, all calculations assume zero mechanical losses. Real-world yields would be lower.
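Those two quantities combine into a simple brightness/runtime trade-off, sketched here in Python with illustrative numbers:

```python
def average_lumens(energy_j: float, efficacy_lm_per_w: float, runtime_s: float) -> float:
    """Average brightness when energy_j is spent over runtime_s.

    energy_j * efficacy gives the total light in lumen-seconds;
    dividing by the runtime trades brightness against duration.
    """
    return energy_j * efficacy_lm_per_w / runtime_s

# 200 J through a good 100 lm/W LED: bright briefly, or dim for a while.
print(average_lumens(200, 100, 25))    # 800 lm -- 60-watt-bulb territory, for 25 s
print(average_lumens(200, 100, 1800))  # ~11 lm over half an hour
```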

The Hall of Shame

Gravia Lamp

Ah, where it all began, the Gravia lamp. Now it may seem unfair to pick on a six-year-old, already thoroughly debunked project, but for the sake of completeness (and the sheer egregiousness of its claims) I will include it here.

A fifty-pound weight. Raised 58 inches. Advertised as lighting your living room for four hours.

50 lb × 58 in × 9.8 m/s² ÷ 4 hr = 22.7 mW. Even assuming some exotic alien technology that gives us the theoretical maximum efficacy allowed by the laws of physics, you get… 16 lumens, or about one candle’s worth of harsh, green light.
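The same arithmetic with the unit conversions spelled out, as a quick Python check:

```python
LB_TO_KG = 0.45359237
IN_TO_M = 0.0254
G = 9.8                 # m/s^2
MAX_EFFICACY = 683      # lm/W, theoretical ceiling (pure green light)

energy_j = 50 * LB_TO_KG * G * 58 * IN_TO_M  # ~327 J stored
power_w = energy_j / (4 * 3600)              # spread over four hours
print(round(power_w * 1000, 1))              # ~22.7 (mW)
print(round(power_w * MAX_EFFICACY))         # ~16 (lumens)
```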

The amount of press this got was frankly embarrassing.


GravityLight

Up next, GravityLight. A similar concept: lift up a weight and generate light as it falls… with the added cachet of helping the poor in Africa.

Assuming twenty pounds of rocks and a two-meter drop, we get 20 lb × 2 m × 9.8 m/s² = 178 J of stored energy. Add a 95 lm/W high-efficiency LED and we get 16,900 lm·s of light. This could deliver the brightness of a 60-watt incandescent for a whopping 21 seconds. Dial it down to their advertised running time of thirty minutes, and you get 9 lumens. Again, about one candle.

Now I admit I’ve come around a bit on these guys. The light is certainly not as bright as many expected, and the hype writes checks that the product can’t quite cash. But my keychain flashlight has a low-power mode of ten lumens, and I have to say it is sufficient for reading or other household tasks performed in a smallish room, especially once your eyes adjust. And for its intended market of the third world, that is certainly better than darkness.

But not exactly the Hanukkah miracle.

Gravity Batteries

At last, Gravity Batteries. This was the one that made me snap and write this post. Unlike the others, it’s not for lighting but rather general energy storage. As such, at least it doesn’t make the same mistake of suggesting it should be human powered.

But as should be clear by now, the energy density of gravity is just ridiculously low compared to practically any other technology we have. It works for hydroelectric plants because they have huge reservoirs to supply them.

No hard numbers are provided on the size of these gravity batteries, but let’s assume one uses a 100 m deep shaft with a counterweight the mass of a Cadillac Escalade. The energy stored is 100 m × 2,700 kg × 9.8 m/s² = 2.6 MJ (0.7 kWh). The first deep-cycle battery I found through ten seconds of googling (retail price $260) has a capacity of 90 Ah. 90 Ah × 12 V × (80% discharge cycle) = 3.1 MJ. It just doesn’t add up!
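The comparison in Python, using the same back-of-the-envelope figures (the shaft and counterweight are the assumptions stated above, not real product specs):

```python
G = 9.8  # m/s^2

# Hypothetical gravity battery: 100 m shaft, ~2,700 kg counterweight.
shaft_j = 2700 * G * 100               # ~2.6 MJ

# Cheap deep-cycle lead-acid battery: 90 Ah at 12 V, 80% usable.
battery_j = 90 * 12 * 3600 * 0.80      # ~3.1 MJ

print(round(shaft_j / 1e6, 2), round(battery_j / 1e6, 2))  # 2.65 3.11
```

The $260 battery beats the mineshaft.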

Now I can’t personally speak to the logistics of building a shaft taller than most elevators, hoisting a weight in it heavier than most elevators, oh and btdubs did I mention this shaft is going into the ground where you have to deal with water seepage, corrosion, extreme temperature gradients…, but I’m sure it costs a hell of a lot more than $260.

Gravity energy storage only makes sense when your reaction mass is on the order of cubic miles.

Here’s another way to think about it: if gravity had a comparable energy density to chemical energy, you’d metabolize half your body mass simply climbing up a hill.

Go 1.11 got me to stop ignoring Go

I took a few looks at Go over the years, starting who knows when. My first serious attempt to sit down and learn some damn Go was in 2014, when I set a new personal best at almost 200 lines of code before I got sick of it. I kept returning to Go because I could see how much potential it had, but every time I was turned off for the same reason: GOPATH.

You see, GOPATH crossed a line. Go is opinionated, which is fine, but with GOPATH its opinions extended beyond my Go work and into the rest of my system. As a naive new Go user, I was prepared to accept their opinions on faith - but only within their domain. I already have opinions about how to use my computer. I knew Go was cool, but it could be the second coming of Christ, and so long as it was annoying to use and didn’t integrate with my workflow, I (rightfully) wouldn’t care.

Thankfully Go 1.11 solves this problem, and solves it delightfully well. I can now keep Go’s influence contained to the Go projects I work with, and in that environment I’m much more forgiving of anything it wants to do. And when considered in the vacuum of Go, what it wants to do is really compelling. Go modules are great, and probably the single best module system I’ve used in any programming language. Go 1.11 took my biggest complaint and turned it into one of my biggest compliments. Now that the One Big Problem is gone, I’ve really started to appreciate Go. Let me tell you about it.

The most important feature of Go is its simplicity. The language is small and it grows a small number of features in each release, which rarely touch the language itself. Some people see this as stagnation, but I see it as stability and I know that very little Go code in the wild, no matter how old, is going to be unidiomatic or fail to compile. Even setting aside stability, the conservative design of the language makes Go code in the wild remarkably consistent. Almost all third-party Go libraries are high quality stuff. Gofmt helps with this as well.[1] The limitations of the language and the way the stdlib gently nudges you into good patterns make it easy to write good Go code. Most of the “bad” Go libraries I’ve found are trying to work around Go’s limitations instead of embracing them.

There’s more. The concurrency model is superb. It should come as no surprise that a language built by the alumni of Plan 9 would earn high marks in this regard, and consequently you can scale your Go program up to be as concurrent as you want without even thinking about it. The standard library is also excellent - designed consistently and designed well, and I can count on one hand (or even one finger) the number of stdlib modules I’ve encountered that feel crusty. The type system is great, too. It’s the perfect balance of complexity and simplicity that often effortlessly grants these traits to the abstractions you make with it.

I’m not even slightly bothered by the lack of generics - years as a C programmer taught me not to need them, and I think most of the cases where they’re useful are to serve designs which are too complicated to use anyway. I do have some complaints, though. The concurrency model is great, but a bit too magical and implicit. Error handling is annoying, especially because finding the origin of the error is unforgivably difficult, but I don’t know how to improve it. The log module leaves a lot to be desired and can’t be changed because of legacy support. interface{} is annoying when you have to deal with it, such as when unmarshalling JSON that won’t fit neatly into a struct.

My hope for the future of Go is that it will continue to embrace simplicity in the face of cries for complexity. I consider Go modules a runaway success compared to dep, and I hope to see this story repeated[2] before hastily adding generics, better error handling, etc. Go doesn’t need to compete with anyone like Rust, and trying to will probably ruin what makes Go great. My one request of the Go team: don’t make changes in Go 2.0 which make the APIs of existing libraries unidiomatic.

Though I am growing very fond of it, by no means am I turning into a Go zealot. I still use C, Python, and more all the time and have no intention of stopping. A programming language which tries to fill all niches is a failed programming language. But, to those who were once like me: Go is good now! In fact, it’s great! Try it!

[1] I have minor gripes with gofmt, but the benefits make up for it beautifully. On the other hand, I have major gripes with PEP-8, and if you ever see me using it I want you to shoot me in the face.

[2] Though hopefully with less drama.

Notes on using git-replace to get rid of giant objects

A couple of years ago someone accidentally committed a 350 megabyte file to our Git repository. Now it's baked in. I wanted to get rid of it. I thought that I might be able to work out a partial but lightweight solution using git-replace.

Summary: It didn't work.


In 2016, a programmer committed a 350 megabyte file to my employer's repo, then in the following commit they removed it again. Of course it's still in there, because someone might check out the one commit where it existed. Everyone who clones the repo gets a copy of the big file. Every copy of the repo takes up an extra 350 megabytes on disk.

The usual way to fix this is onerous:

  1. Use git-filter-branch to rebuild all the repository history after the bad commit.

  2. Update all the existing refs to point to the analogous rebuilt objects.

  3. Get everyone in the company to update all the refs in their local copies of the repo.

I thought I'd tinker around with git-replace to see if there was some way around this, maybe something that someone could do locally on their own repo without requiring everyone else to go along with it.

The git-replace command annotates the Git repository to say that whenever object A is wanted, object B should be used instead. Say that the 350 MB file has an ID of ffff9999ffff9999ffff9999ffff9999ffff9999. I can create a small file that says

 This is a replacement object.  It replaces a very large file
 that was committed by mistake.  To see the commit as it really
 was, use

      git --no-replace-objects show 183a5c7e90b2d4f6183a5c7e90b2d4f6183a5c7e
      git --no-replace-objects checkout 183a5c7e90b2d4f6183a5c7e90b2d4f6183a5c7e

 or similarly.  To see the file itself, use

      git --no-replace-objects show ffff9999ffff9999ffff9999ffff9999ffff9999

I can turn this small file into an object with git-add; say the new small object has ID 1111333311113333111133331111333311113333. I then run:

git replace ffff9999ffff9999ffff9999ffff9999ffff9999 1111333311113333111133331111333311113333

This creates .git/refs/replace/ffff9999ffff9999ffff9999ffff9999ffff9999, which contains the text 1111333311113333111133331111333311113333. Thenceforward, any Git command that tries to access the original object ffff9999 will silently behave as if it were 11113333 instead. For example, git show 183a5c7e will show the diff between that commit and the previous one, as if the user had committed my small file back in 2016 instead of their large one. And checking out that commit will check out the small file instead of the large one.

So far this doesn't help much. The checkout is smaller, but nobody was likely to have that commit checked out anyway. The large file is still in the repository, and clones and transfers still clone and transfer it.

The first thing I tried was a wan hope: will git gc discard the replaced object? No, of course not. The ref in refs/replace/ counts as a reference to it, and it will never be garbage-collected. If it had been, you would no longer be able to examine it with the --no-replace-objects commands. So much for following the rules!

Now comes the hacking part: I am going to destroy the actual object. Say for example, what if:

cp /dev/null .git/objects/ff/ff9999ffff9999ffff9999ffff9999ffff9999

Now the repository is smaller! And maybe Git won't notice, as long as I do not use --no-replace-objects?

Indeed, much normal Git usage doesn't notice. For example, I can make new commits with no trouble, and of course any other operation that doesn't go back as far as 2016 doesn't notice the change. And git-log works just fine even past the bad commit; it only looks at the replacement object and never notices that the bad object is missing.

But some things become wonky. You get an error message when you clone the repo because an object is missing. The replacement refs are local to the repo, and don't get cloned, so clone doesn't know to use the replacement object anyway. In the clone, you can use git replace -f .... to reinstate the replacement, and then all is well unless something tries to look at the missing object. So maybe a user could apply this hack on their own local copy if they are willing to tolerate a little wonkiness…?

No. Unfortunately, there is a show-stopper: git-gc no longer works in either the parent repo or in the clone:

fatal: unable to read ffff9999ffff9999ffff9999ffff9999ffff9999
error: failed to run repack

and it doesn't create the pack files. It dies, and leaves behind a .git/objects/pack/tmp_pack_XxXxXx that has to be cleaned up by hand.

I think I've reached the end of this road. Oh well, it was worth a look.
