
Let me tell you about the still-not-defunct real-time log processing pipeline we built at my now-defunct last job. It handled logs from a large number of embedded devices that our ISP operated on behalf of residential customers. (I wrote and presented previously about some of the cool wifi diagnostics that were possible with this data set.)

Lately, I've had a surprisingly large number of conversations about logs processing pipelines. I can find probably 10+ already-funded, seemingly successful startups processing logs, and the Big Name Cloud providers all have some kind of logs thingy, but still, people are not satisfied. It's expensive and slow. And if you complain, you mostly get told that you shouldn't be using unstructured logs anyway, you should be using event streams.

That advice is not wrong, but it's incomplete.

Instead of doing a survey of the whole unhappy landscape, let's just ignore what other people suffer with and talk about what does work. You can probably find, somewhere, something similar to each of the components I'm going to talk about, but you probably can't find a single solution that combines it all with good performance and super-low latency for a reasonable price. At least, I haven't found it. I was a little surprised by this, because I didn't think we were doing anything all that innovative. Apparently I was incorrect.

The big picture

Let's get started. Here's a handy diagram of all the parts we're going to talk about:

The ISP where I worked has a bunch of embedded Linux devices (routers, firewalls, wifi access points, and so on) that we wanted to monitor. The number increased rapidly over time, but let's talk about a nice round number, like 100,000 of them. Initially there were zero, then maybe 10 in our development lab, and eventually we hit 100,000, and later there were many more than that. Whatever. Let's work with 100,000. But keep in mind that this architecture works pretty much the same with any number of devices.

(It's a "distributed system" in the sense of scalability, but it's also the simplest thing that really works for any number of devices more than a handful, which makes it different from many "distributed systems" where you could have solved the problem much more simply if you didn't care about scaling. Since our logs are coming from multiple sources, we can't make it non-distributed, but we can try to minimize the number of parts that have to deal with the extra complexity.)

Now, these are devices we're monitoring, not apps or services or containers or whatever. That means two things: we had to deal with lots of weird problems (like compiler/kernel bugs and hardware failures), and most of the software was off-the-shelf OS stuff we couldn't easily control (or didn't want to rewrite).

(Here's the good news: because embedded devices have all the problems from top to bottom, any solution that works for my masses of embedded devices will work for any other log-pipeline problem you might have. If you're lucky, you can just leave out some parts.)

That means the debate about "events" vs "logs" was kind of moot. We didn't control all the parts in our system, so telling us to forget logs and use only structured events doesn't help. udhcpd produces messages the way it wants to produce messages, and that's life. Sometimes the kernel panics and prints whatever it wants to print, and that's life. Move on.

Of course, we also had our own apps, which means we could also produce our own structured events when it was relevant to our own apps. Our team had whole never-ending debates about which is better, logs or events, structured or unstructured. In fact, in a move only overfunded megacorporations can afford, we actually implemented both and ran them both for a long time.

Thus, I can now tell you the final true answer, once and for all: you want structured events in your database.

...but you need to be able to produce them from unstructured logs. And once you can do that, exactly how those structured events are produced (either from logs or directly from structured trace output) turns out to be irrelevant.

But we're getting ahead of ourselves a bit. Let's take our flow diagram, one part at a time, from left to right.

Userspace and kernel messages, in a single stream

Some people who have been hacking on Linux for a while may know about /proc/kmsg: that's the file good old (pre-systemd) klogd reads kernel messages from, and pumps them to syslogd, which saves them to a file. Nowadays systemd does roughly the same thing but with more d-bus and more corrupted binary log files. Ahem. Anyway. When you run the dmesg command, it reads the same messages (in a slightly different way).

What you might not know is that you can go the other direction. There's a file called /dev/kmsg (note: /dev and not /proc) which, if you write to it, produces messages into the kernel's buffer. Let's do that! For all our messages!
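To make that concrete, here's a minimal sketch (in Python, purely for illustration; our real tool was C) of what writing to /dev/kmsg looks like. The "<N>" prefix sets the syslog priority, and each write() call becomes exactly one kernel log record. The function names are mine, not from our actual code:

```python
import os

def format_kmsg(tag: str, message: str, priority: int = 6) -> bytes:
    # The "<N>" prefix sets syslog priority; 6 = KERN_INFO.
    # Prefixing the app name gives us klogd-style "tag: message" records.
    return f"<{priority}>{tag}: {message}\n".encode()

def log_to_kmsg(tag: str, message: str) -> None:
    # Needs permission to write /dev/kmsg (usually root). One write()
    # per message, so each message becomes exactly one log record.
    fd = os.open("/dev/kmsg", os.O_WRONLY)
    try:
        os.write(fd, format_kmsg(tag, message))
    finally:
        os.close(fd)
```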

Wait, what? Am I crazy? Why do that?

Because we want strict sequencing of log messages between programs. And we want that even if your kernel panics.

Imagine you have, say, a TV DVR running on an embedded Linux system, and whenever you go to play a particular recorded video, the kernel panics because your chipset vendor hates you. Hypothetically. (The feeling is, hypothetically, mutual.) Ideally, you would like your logs to contain a note that the user requested the video, the video is about to start playing, we've opened the file, we're about to start streaming the file to the proprietary and very buggy (hypothetical) video decoder... boom. Panic.

What now? Well, if you're writing the log messages to disk, the joke's on you, because I bet you didn't fsync() after each one. (Once upon a time, syslogd actually did fsync() after each one. It was insanely disk-grindy and had very low throughput. Those days are gone.) Moreover, a kernel panic kills the disk driver, so you have no chance to fsync() it after the panic, unless you engage one of the more terrifying hacks like, after a panic, booting into a secondary kernel whose only job is to stream the message buffer into a file, hoping desperately that the disk driver isn't the thing that panicked, that the disk itself hasn't fried, and that even if you do manage to write to some disk blocks, they are the right ones because your filesystem data structure is reasonably intact.

(I suddenly feel a lot of pity for myself after reading that paragraph. I think I am more scars than person at this point.)


The kernel log buffer lives in a fixed-size memory buffer in RAM. It defaults to being kinda small (tens or hundreds of kBytes), but you can make it bigger if you want. I suggest you do so.

By itself, this won't solve your kernel panic problems, because RAM is even more volatile than disk, and you have to reboot after a kernel panic. So the RAM is gone, right?

Well, no. Sort of. Not exactly.

Once upon a time, your PC BIOS would go through all your RAM at boot time and run a memory test. I remember my ancient 386DX PC used to do this with my amazingly robust and life-changing 4MB of RAM. It took quite a while. You could press ESC to skip it if you were a valiant risk-taking rebel like myself.

Now, memory is a lot faster than it used to be, but unfortunately it has gotten bigger more quickly than it has gotten faster, especially if you disable memory caching, which you certainly must do at boot time in order to write the very specific patterns needed to see if there are any bit errors.

So... we don't. That ended years ago. If you reboot your system, the memory mostly will contain the stuff it contained before you rebooted. The OS kernel has to know that and zero out pages as they get used. (Sometimes the kernel gets fancy and pre-zeroes some extra pages when it's not busy, so it can hand out zero pages more quickly on demand. But it always has to zero them.)

So, the pages are still around when the system reboots. What we want to happen is:

  1. The system reboots automatically after a kernel panic. You can do this by giving your kernel a boot parameter like "panic=1", which reboots it after one second. (This is not nearly enough time for an end user to read and contemplate the panic message. That's fine, because a) on a desktop PC, X11 will have crashed in graphics mode so you can't see the panic message anyway, and b) on an embedded system there is usually no display to put the message on. End users don't care about panic messages. Our job is to reboot, ASAP, so they don't try to "help" by power cycling the device, which really does lose your memory.) (Advanced users will make it reboot after zero seconds. I think panic=0 disables the reboot feature rather than doing that, so you might have to patch the kernel. I forget. We did it, whatever it was.)

  2. The kernel always initializes the dmesg buffer in the same spot in RAM.

  3. The kernel notices that a previous dmesg buffer is already in that spot in RAM (because of a valid signature or checksum or whatever) and decides to append to that buffer instead of starting fresh.

  4. In userspace, we pick up log processing where we left off. We can capture the log messages starting before (and therefore including) the panic!

  5. And because we redirected userspace logs to the kernel message buffer, we have also preserved the exact sequence of events that led up to the panic.

If you want all this to happen, I have good news and bad news. The good news is we open sourced all our code; the bad news is it didn't get upstreamed anywhere so there are no batteries included and no documentation and it probably doesn't quite work for your use case. Sorry.

Open source code:

  • logos tool for sending userspace logs to /dev/kmsg. (It's logs... for the OS... and it's logical... and it brings your logs back from the dead after a reboot... get it? No? Oh well.) This includes two per-app token buckets (burst and long-term) so that an out-of-control app won't overfill the limited amount of dmesg space.

  • PRINTK_PERSIST patch to make Linux reuse the dmesg buffer across reboots.
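Here's a rough sketch of the token bucket idea, in Python for illustration (logos itself is C); the class and parameter names are mine. Per app you'd keep two of these, one tuned for bursts and one for the long-term average, and only log a message if both allow it:

```python
import time
from typing import Optional

class TokenBucket:
    """Allows short bursts while capping the long-term message rate."""

    def __init__(self, capacity: float, refill_per_sec: float):
        self.capacity = capacity              # max burst, in messages
        self.refill_per_sec = refill_per_sec  # sustained messages/sec
        self.tokens = capacity
        self.last = time.monotonic()

    def allow(self, now: Optional[float] = None) -> bool:
        now = time.monotonic() if now is None else now
        elapsed = max(0.0, now - self.last)
        self.tokens = min(self.capacity,
                          self.tokens + elapsed * self.refill_per_sec)
        self.last = now
        if self.tokens >= 1.0:
            self.tokens -= 1.0
            return True
        return False  # over budget: drop (or count) the message
```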

Even if you don't do any of the rest of this, everybody should use PRINTK_PERSIST on every computer, virtual or physical. Seriously. It's so good.

(Note: room for improvement: it would be better if we could just redirect app stdout/stderr directly to /dev/kmsg, but that doesn't work as well as we want. First, it doesn't auto-prefix incoming messages with the app name. Second, libc functions like printf() actually write a few bytes at a time, not one message per write() call, so they would end up producing more than one dmesg entry per line. Third, /dev/kmsg doesn't support the token bucket rate control that logos does, which turns out to be essential, because sometimes apps go crazy. So we'd have to further extend the kernel API to make it work. It would be worthwhile, though, because the extra userspace process causes an unavoidable delay between when a userspace program prints something and when it actually gets into the kernel log. That delay is enough time for a kernel to panic, and the userspace message gets lost. Writing directly to /dev/kmsg would take less CPU, leave userspace latency unchanged, and ensure the message is safely written before continuing. Someday!)

(In related news, all of syslogd is kinda extraneous for this reason. So is whatever systemd does. Why do we make everything so complicated? Just write directly to files or the kernel log buffer. It's cheap and easy.)

Uploading the logs

Next, we need to get the messages out of the kernel log buffer and into our log processing server, wherever that might be.

(Note: if we do the above trick - writing userspace messages to the kernel buffer - then we can't also use klogd to read them back into syslogd. That would create an infinite loop, and would end badly. Ask me how I know.)

So, no klogd -> syslogd -> file. Instead, we have something like syslogd -> kmsg -> uploader or app -> kmsg -> uploader.

What is a log uploader? Well, it's a thing that reads messages from the kernel kmsg buffer as they arrive, and uploads them to a server, perhaps over https. It might be almost as simple as "dmesg | curl", like my original prototype, but we can get a bit fancier:

  • Figure out which messages we've already uploaded (eg. from the persistent buffer before we rebooted) and don't upload those again.

  • Log the current wall-clock time before uploading, giving us sync points between monotonic time (/dev/kmsg logs "microseconds since boot" by default, which is very useful, but we also want to be able to correlate that with "real" time so we can match messages between related machines).

  • Compress the file on the way out.

  • Somehow authenticate with the log server.

  • Bonus: if the log server is unavailable because of a network partition, try to keep around the last few messages from before the partition, as well as the recent messages once the partition is restored. If the network partition was caused by the client - not too rare if you, like us, were in the business of making routers and wifi access points - you really would like to see the messages from right before the connectivity loss.
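The monotonic/wall-clock correlation in the second bullet is simple once you have a sync point. A sketch (the function name is mine):

```python
from datetime import datetime, timedelta, timezone

def wall_time_for(msg_monotonic_us: int,
                  sync_monotonic_us: int,
                  sync_wall: datetime) -> datetime:
    """Turn a /dev/kmsg 'microseconds since boot' stamp into wall-clock
    time, given one sync point (monotonic stamp + wall clock) that the
    uploader logged just before the upload."""
    return sync_wall + timedelta(
        microseconds=msg_monotonic_us - sync_monotonic_us)
```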

Luckily for you, we also open sourced our code for this. It's in C, so it's very small and low-overhead. We never got the code for the "bonus" feature working quite right, though; we kinda got interrupted at the last minute.

Open source code:

  • loguploader C client, including an rsyslog plugin for Debian in case you don't want to use the /dev/kmsg trick.

  • devcert, a tool (and Debian package) which auto-generates a self signed "device certificate" wherever it's installed. The device certificate is used by a device (or VM, container, whatever) to identify itself to the log server, which can then decide how to classify and store (or reject) its logs.

One thing we unfortunately didn't get around to doing was modifying the logupload client to stream logs to the server. This is possible using HTTP POST and Chunked encoding, but our server at the time was unable to accept streaming POST requests due to (I think now fixed) infrastructure limitations.

(Note: if you write load balancing proxy servers or HTTP server frameworks, make sure they can start processing a POST request as soon as all the headers have arrived, rather than waiting for the entire blob to be complete! Then a log upload server can just stream the bytes straight to the next stage even before the whole request has finished.)

Because we lacked streaming in the client, we had to upload chunks of log periodically, which leads to a tradeoff about what makes a good upload period. We eventually settled on about 60 seconds, which ended up accounting for almost all the end-to-end latency from message generation to our monitoring console.

Most people probably think 60 seconds is not too bad. But thanks to the awesome team I was working with, we managed to squeeze all the other pipeline phases down to tens of milliseconds in total. So the remaining 60 seconds (technically: anywhere from 0 to 60 seconds after a message was produced) was kinda embarrassing.

The log receiver

So okay, we're uploading the logs from client to some kind of server. What does the server do?

This part is both the easiest and the most reliability-critical. The job is this: receive an HTTP POST request, write the POST data to a file, and return HTTP 200 OK. Anybody who has any server-side experience at all can write this in their preferred language in about 10 minutes.
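In case "10 minutes" sounds like an exaggeration, here's roughly the whole thing, sketched in Python. The path, header name, and port are made up for illustration, and in the real system the device identity came from the client certificate, not a header:

```python
import os
from http.server import BaseHTTPRequestHandler, HTTPServer

LOG_DIR = "uploads"  # illustrative; ours was a distributed filesystem

def append_chunk(path: str, data: bytes) -> None:
    # The receiver's entire job: add these bytes to the end of a file.
    with open(path, "ab") as f:
        f.write(data)

class LogReceiver(BaseHTTPRequestHandler):
    def do_POST(self):
        length = int(self.headers.get("Content-Length", 0))
        data = self.rfile.read(length)
        # Hypothetical header standing in for the client-cert identity.
        device = os.path.basename(self.headers.get("X-Device-Id", "unknown"))
        append_chunk(os.path.join(LOG_DIR, device + ".log"), data)
        self.send_response(200)
        self.end_headers()

# To run: os.makedirs(LOG_DIR, exist_ok=True)
#         HTTPServer(("", 8080), LogReceiver).serve_forever()
```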

We intentionally want to make this phase as absolutely simplistic as possible. This is the phase that accepts logs from the limited-size kmsg buffer on the client and puts them somewhere persistent. It's nice to have real-time alerts, but if I have to choose between somewhat delayed alerts or randomly losing log messages when things get ugly, I'll have to accept the delayed alerts. Don't lose log messages! You'll regret it.

The best way to not lose messages is to minimize the work done by your log receiver. So we did. It receives the uploaded log file chunk and appends it to a file, and that's it. The "file" is actually in a cloud storage system that's more-or-less like S3. When I explained this to someone, they asked why we didn't put it in a Bigtable-like thing or some other database, because isn't a filesystem kinda cheesy? No, it's not cheesy, it's simple. Simple things don't break. Our friends on the "let's use structured events to make metrics" team streamed those events straight into a database, and it broke all the time, because databases have configuration options and you inevitably set those options wrong, and it'll fall over under heavy load, and you won't find out until you're right in the middle of an emergency and you really want to see those logs. Or events.

Of course, the file storage service we used was encrypted-at-rest, heavily audited, and auto-deleted files after N days. When you're a megacorporation, you have whole teams of people dedicated to making sure you don't screw this up. They will find you. Best not to annoy them.

We had to add one extra feature, which was authentication. It's not okay for random people on the Internet to be able to impersonate your devices and spam your logs - at least without putting some work into it. For device authentication, we used the rarely-used HTTP client-side certificates option and the devcert program (linked above) so that the client and server could mutually authenticate each other. The server didn't check the certificates against a certification authority (CA), like web clients usually do; instead, it had a database with a whitelist of exactly which certs we're allowing today. So in case someone stole a device cert and started screwing around, we could remove their cert from the whitelist and not worry about CRL bugs and latencies and whatnot.

Unfortunately, because our log receiver was an internal app relying on internal infrastructure, it wasn't open sourced. But there really wasn't much there, honest. The first one was written in maybe 150 lines of python, and the replacement was rewritten in slightly more lines of Go. No problem.

Retries and floods

Of course, things don't always go smoothly. If you're an ISP, the least easy thing is dealing with cases where a whole neighbourhood gets disconnected, either because of a power loss or because someone cut the fiber Internet feed to the neighbourhood.

Now, disconnections are not such a big deal for logs processing - you don't have any. But reconnection is a really big deal. Now you have tens or hundreds of thousands of your devices coming back online at once, and a) they have accumulated a lot more log messages than they usually do, since they couldn't upload them, and b) they all want to talk to your server at the same time. Uh oh.

Luckily, our system was designed carefully (uh... eventually it was), so it could handle these situations pretty smoothly:

  1. The log uploader uses a backoff timer so that if it's been trying to upload for a while, it uploads less often. (However, the backoff timer was limited to no more than the usual inter-upload interval. I don't know why more people don't do this. It's rather silly for your system to wait longer between uploads in a failure situation than it would in a success situation. This is especially true with logs, where when things come back online, you want a status update now. And clearly your servers have enough capacity to handle uploads at the usual rate, because they usually don't crash. Sorry if I sound defensive here, but I had to have this argument a few times with a few SREs. I understand why limiting the backoff period isn't always the right move. It's the right move here.)

  2. Less obviously, even under normal conditions, the log uploader uses a randomized interval between uploads. This avoids traffic spikes where, after the Internet comes back online, everybody uploads again exactly 60 seconds later, and so on.

  3. The log upload client understands the idea that the server can't accept its request right now. It has to, anyway, because if the Internet goes down, there's no server. So it treats server errors exactly like it treats lack of connectivity. And luckily, log uploading is not really an "interactive" priority task, so it's okay to sacrifice latency when things get bad. Users won't notice. And apparently our network is down, so the admins already noticed.

  4. The /dev/kmsg buffer was configured for the longest reasonable outage we could expect, so that it wouldn't overflow during "typical" downtime. Of course, there's a judgement call here. But the truth is, if you're having system-wide downtime, what the individual devices were doing during that downtime is not usually what you care about. So you only need to handle, say, the 90th percentile of downtime. Ignore the black swans for once.

  5. The log receiver aggressively rejects requests that come faster than its ability to write files to disk. Since the clients know how to retry with a delay, this allows us to smooth out bursty traffic without needing to either over-provision the servers or lose log messages.

    (Pro tip: if you're writing a log receiver in Go, don't do the obvious thing and fire off a goroutine for every incoming request. You'll run out of memory. Define a maximum number of threads you're willing to handle at once, and limit your request handling to that. It's okay to set this value low, just to be safe: remember, the uploader clients will come back later.)
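The backoff logic from points 1 and 2 above comes out to just a few lines. This sketch is illustrative (the names and the exponential base are mine), but it captures both rules: cap the backoff at the normal interval, and add jitter so devices don't reconnect in lockstep:

```python
import random

UPLOAD_INTERVAL = 60.0  # normal seconds between uploads

def next_upload_delay(consecutive_failures: int) -> float:
    """Exponential backoff, capped at the normal upload interval: a
    failing system should never report *less* often than a healthy one.
    The random jitter spreads devices out over time."""
    base = min(UPLOAD_INTERVAL, 2.0 ** consecutive_failures)
    return random.uniform(0.5 * base, base)
```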

Okay! Now our (unstructured) logs from all our 100,000 devices are sitting safely in a big distributed filesystem. We have a little load-balanced, multi-homed cluster of log receivers accepting the uploads, and they're so simple that they should pretty much never die, and even if they do because we did something dumb (treacherous, treacherous goroutines!), the clients will try again.

What might not be obvious is this: our reliability, persistence, and scaling problems are solved. Or rather, as long as we have enough log receiver instances to handle all our devices, and enough disk quota to store all our logs, we will never again lose a log message.

That means the rest of our pipeline can be best-effort, complicated, and frequently exploding. And that's a good thing, because we're going to start using more off-the-shelf stuff, we're going to let random developers reconfigure the filtering rules, and we're not going to bother to configure it with any redundancy.

Grinding the logs

The next step is to take our unstructured logs and try to understand them. In other words, we want to add some structure. Basically we want to look for lines that are "interesting" and parse out the "interesting" data and produce a stream of events, each with a set of labels describing what categories they apply to.

Note that, other than this phase, there is little difference between how you'd design a structured event reporting pipeline and a log pipeline. You still need to collect the events. You still (if you're like me) need to persist your events across kernel panics. You still need to retry uploading them if your network gets partitioned. You still need the receivers to handle overloading, burstiness, and retries. You still would like to stream them (if your infrastructure can handle it) rather than uploading every 60 seconds. You still want to be able to handle a high volume of them. You're just uploading a structured blob instead of an unstructured blob.

Okay. Fine. If you want to upload structured blobs, go for it. It's just an HTTP POST that appends to a file. Nobody's stopping you. Just please try to follow my advice when designing the parts of the pipeline before and after this phase, because otherwise I guarantee you'll be sad eventually.

Anyway, if you're staying with me, now we have to parse our unstructured logs. What's really cool - what makes this a killer design compared to starting with structured events in the first place - is that we can, at any time, change our minds about how to parse the logs, without redeploying all the software that produces them.

This turns out to be amazingly handy. It's so amazingly handy that nobody believes me. Even I didn't believe me until I experienced it; I was sure, in the beginning, that the unstructured logs were only temporary and we'd initially use them to figure out what structured events we wanted to record, and then modify the software to send those, then phase out the logs over time. This never happened. We never settled down. Every week, or at least every month, there was some new problem which the existing "structured" events weren't configured to catch, but which, upon investigating, we realized we could diagnose and measure from the existing log message stream. And so we did!

Now, I have to put this in perspective. Someone probably told you that log messages are too slow, or too big, or too hard to read, or too hard to use, or you should use them while debugging and then delete them. All those people were living in the past and they didn't have our fancy log pipeline. Computers are really, really fast now. Storage is really, really cheap.

So we let it all out. Our devices produced an average of 50 MB of (uncompressed) logs per day, each. For the baseline 100,000 devices that we discussed above, that's about 5TB of logs per day. Ignoring compression, how much does it cost to store, say, 60 days of logs in S3 at 5TB per day? "Who cares," that's how much. You're amortizing it over 100,000 devices. Heck, a lot of those devices were DVRs, each with 2TB of storage. With 100,000 DVRs, that's 200,000 TB of storage. Another 300 TB is literally a rounding error (like, smaller than if I can't remember whether it's really 2TB or 2TiB or what).

Our systems barfed up logs vigorously and continuously, like a non-drunken non-sailor with seasickness. And it was beautiful.

(By the way, now would be a good time to mention some things we didn't log: personally identifiable information or information about people's Internet usage habits. These were diagnostic logs for running the network and detecting hardware/software failures. We didn't track what you did with the network. That was an intentional decision from day 1.)

(Also, this is why I think all those log processing services are so badly overpriced. I wanna store 50 MB per device, for lots of devices. I need to pay S3 rates for that, not a million dollars a gigabyte. If I have to overpay for storage, I'll have to start writing fewer logs. I love my logs. I need my logs. I know you're just storing it in S3 anyway. Let's be realistic.)

But the grinding, though

Oh right. So the big box labeled "Grinder" in my diagram was, in fact, just one single virtual machine, for a long time. It lasted like that for much longer than we expected.

Whoa, how is that possible, you ask?

Well, at 5TB per day per 100,000 devices, that's an average of 57 MBytes per second. And remember, burstiness has already been absorbed by our carefully written log receivers and clients, so we'll just grind these logs as fast as they arrive or as fast as we can, and if there are fluctuations, they'll average out. Admittedly, some parts of the day are busier than others. Let's say 80 MBytes per second at peak.

80 MBytes per second? My laptop can do that on its spinning disk. I don't even need an SSD! 80 MBytes per second is a toy.

And of course, it's not just one spinning disk. The data itself is stored on some fancy heavily-engineered distributed filesystem that I didn't have to design. Assuming there are no, er, colossal failures in provisioning (no comment), there's no reason we shouldn't be able to read files at a rate that saturates the network interface available to our machine. Surely that's at least 10 Gbps (~1 GByte/sec) nowadays, which is 12.5 times that peak rate: 1.25 million devices, all processed by a single grinder.
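A quick sanity check of the arithmetic above (the 80 MBytes/sec peak is the assumed figure, not a measurement):

```python
devices = 100_000
log_bytes_per_day = 50e6                      # 50 MB per device, uncompressed
total_per_day = devices * log_bytes_per_day   # 5 TB/day
avg_rate = total_per_day / 86_400             # bytes/sec, averaged over a day

peak_rate = 80e6     # assumed peak, bytes/sec
nic_rate = 1e9       # 10 Gbps ~= 1 GByte/sec
headroom = nic_rate / peak_rate               # how many peaks fit in one NIC
max_devices = int(devices * headroom)
```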

Of course you'll probably need to use a few CPU cores. And the more work you do per log entry, the slower it'll get. But these estimates aren't too far off what we could handle.

And yeah, sometimes that VM gets randomly killed by the cluster's Star Trek-esque hive mind for no reason. It doesn't matter, because the input data was already persisted by the log receivers. Just start a new grinder and pick up where you left off. You'll have to be able to handle process restarts no matter what. And that's a lot easier than trying to make a distributed system you didn't need.

As for what the grinder actually does? Anything you want. But it's basically the "map" phase in a mapreduce. It reads the data in one side, does some stuff to it, and writes out postprocessed stuff on the other side. Use your imagination. And if you want to write more kinds of mappers, you can run them, either alongside the original Grinder or downstream from it.

Our Grinder mostly just ran regexes and put out structures (technically protobufs) that were basically sets of key-value pairs.
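Here's the flavor of it, sketched in Python. The rules and log formats below are made up for illustration; the real point is that the rule set lives server-side and can change at any time without touching the devices:

```python
import re
from typing import Optional

# Made-up rules for illustration: each "interesting" pattern maps a raw
# log line to an event with labeled fields.
RULES = [
    (re.compile(r"DHCPACK on (?P<ip>\S+) to (?P<mac>\S+)"),
     {"event": "dhcp_ack"}),
    (re.compile(r"Kernel panic - not syncing: (?P<reason>.*)"),
     {"event": "kernel_panic"}),
]

def grind(line: str) -> Optional[dict]:
    """Map one unstructured log line to a structured key-value event."""
    for pattern, labels in RULES:
        m = pattern.search(line)
        if m:
            return {**labels, **m.groupdict()}
    return None  # uninteresting; skipped here, but still stored upstream
```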

(For some reason, when I search the Internet for "streaming mapreduce," I don't get programs that do this real-time processing of lots of files as they get written. Instead, I seem to get batch-oriented mapreduce clones that happen to read from stdin, which is a stream. I guess. But... well, now you've wasted some perfectly good words that could have meant something. So okay, too bad, it's a Grinder. Sue me.)

Reducers and Indexers

Once you have a bunch of structured events... well, I'm not going to explain that in a lot of detail, because it's been written about a lot.

You probably want to aggregate them a bit - eg. to count up reboots across multiple devices, rather than storing each event for each device separately - and dump them into a time-series database. Perhaps you want to save and postprocess the results in a monitoring system named after Queen Elizabeth or her pet butterfly. Whatever. Plug in your favourite.

What you probably think you want to do, but it turns out you rarely need, is full-text indexing. People just don't grep the logs across 100,000 devices all that often. I mean, it's kinda nice to have. But it doesn't have to be instantaneous. You can plug in your favourite full text indexer if you like. But most of the time, just an occasional big parallel grep (perhaps using your favourite mapreduce clone or something more modern... or possibly just using grep) of a subset of the logs is sufficient.

(If you don't have too many devices, even a serial grep can be fine. Remember, a decent cloud computer should be able to read through ~1 GByte/sec, no problem. How much are you paying for someone to run some bloaty full-text indexer on all your logs, to save a few milliseconds per grep?)

I mean, run a full text indexer if you want. The files are right there. Don't let me stop you.

On the other hand, being able to retrieve the exact series of logs - let's call it the "narrative" - from a particular time period across a subset of devices turns out to be super useful. A mini-indexer that just remembers which logs from which devices ended up in which files at which offsets is nice to have. Someone else on our team built one of those eventually (once we grew so much that our parallel grep started taking minutes instead of seconds), and it was very nice.
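The shape of such a mini-indexer (a toy sketch; the format of the one our team actually built isn't described here) is roughly: remember (device, time) → (file, offset), then answer range queries by seeking instead of grepping.

```python
from collections import defaultdict

class NarrativeIndex:
    """Toy index: which device's logs landed in which file at which offset."""
    def __init__(self):
        self.entries = defaultdict(list)  # device_id -> [(timestamp, path, offset)]

    def record(self, device_id, timestamp, path, offset):
        self.entries[device_id].append((timestamp, path, offset))

    def lookup(self, device_id, t_start, t_end):
        """Return (path, offset) pairs for chunks within [t_start, t_end]."""
        return [(p, o) for (ts, p, o) in self.entries[device_id]
                if t_start <= ts <= t_end]

idx = NarrativeIndex()
idx.record("dev1", 100, "logs/0001.log", 0)
idx.record("dev1", 200, "logs/0002.log", 4096)
print(idx.lookup("dev1", 150, 250))  # [('logs/0002.log', 4096)]
```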

And then you can build your dashboards

Once you've reduced, aggregated, and indexed your events into your favourite output files and databases, you can read those databases to build very fast-running dashboards. They're fast because the data has been preprocessed in mostly-real time.

As I mentioned above, we had our pipeline reading the input files as fast as they could come in, so the receive+grind+reduce+index phase only took a few tens of milliseconds. If your pipeline isn't that fast, ask somebody why. I bet their program is written in java and/or has a lot of sleep() statements.

Again here, I'm not going to recommend a dashboard tool. There are millions of articles and blog posts about that. Pick one, or many.

In conclusion

Please, please, steal these ideas. Make your log and event processing as stable as our small team made ours. Don't fight over structured vs unstructured; if you can't agree, just log them both.

Don't put up with weird lags and limits in your infrastructure. We made 50MB/day/device work for a lot of devices, and real-time mapreduced them all on a single VM. If we can do that, then you can make it work for a few hundreds, or a few thousands, of container instances. Don't let anyone tell you you can't. Do the math: of course you can.


Eventually our team's log processing infrastructure evolved to become the primary monitoring and alerting infrastructure for our ISP. Rather than alerting on behaviour of individual core routers, it turned out that the end-to-end behaviour observed by devices in the field was a better way to detect virtually any problem. Alert on symptoms, not causes, as the SREs like to say. Who has the symptoms? End users.

We had our devices ping different internal servers periodically and log the round trip times; in aggregate, we had an amazing view of overloading, packet loss, bufferbloat, and poor backbone routing decisions, across the entire fleet, across every port of every switch. We could tell which was better, IPv4 or IPv6. (It's always IPv4. Almost everyone spends more time optimizing their IPv4 routes and peering. Sorry, but it's true.)

We detected some weird configuration problems with the DNS servers in one city by comparing the 90th percentile latency of DNS lookups across all the devices in every city.

We diagnosed a manufacturing defect in a particular batch of devices, just based on their CPU temperature curves and fan speeds.

We worked with our CPU vendor to find and work around a bug in their cache coherency, because we spotted a kernel panic that would happen randomly every 10,000 CPU-hours, but for every 100,000 devices, that's still 10 times per hour of potential clues.

...and it sure was good for detecting power failures.

Anyway. Log more stuff. Collect those logs. Let it flow. Trust me.

posted by [syndicated profile] newelementary_feed at 12:30pm on 16/02/2019

Posted by Tim Johnson

Sven Franic reviews LEGO® 21316 The Flintstones for us today but will it be a yabba dabba doo or a yabba dabba don't? The set is available to VIPs this Wednesday 20 February 2019, comes with 748 pieces and four minifigures, and retails for US$59.99/ CA$79.99/ 59.99€/ £54.99/ 549DKK/ AU$99.99.

LEGO Ideas is a constantly evolving system, but there are a couple of individuals who seem to have cracked the magic formula. This is fan designer Andrew Clark’s second Ideas submission to make it all the way to store shelves after 21304 Doctor Who in 2016.

Continue reading »
posted by [personal profile] supergee at 07:57am on 16/02/2019
Rap music is used to get people hooked on drugs. [Central Maine]
posted by [syndicated profile] fictionmachine_feed at 09:47am on 16/02/2019

Posted by Grant Watson

Decades after leaving the Hundred Acre Wood for boarding school, troubled veteran Christopher Robin (Ewan McGregor) now works in middle-management at a London-based luggage company. Neglecting his wife (Hayley Atwell) and daughter for his job, Christopher reaches an ethical crisis when he is confronted with firing half of his staff to save the company money. It is at this point that his childhood teddy bear Winnie the Pooh (Jim Cummings) returns to up-end his life.

In recent years the Walt Disney Company has built up a remarkably successful side-line in adapting their existing animated features into live-action. These have essentially seen release on an annual basis, although 2019 is set to witness three of them hit the cinemas (Dumbo, Aladdin, and The Lion King, in something of a release schedule traffic jam). 2018’s effort was Marc Forster’s Christopher Robin, which was met with moderate commercial success but what felt like popular disinterest. It came, it made a profit, but it never felt as if it caused so much as a ripple in popular culture. It was not really talked about, and seemed to receive no positive or negative responses at all. I must admit that I overlooked its release entirely.

That’s a minor shame. I say minor because Christopher Robin is, ultimately, a minor sort of a film. It does not feel artistically necessary, nor commercially savvy, and seems destined to be half-forgotten within a year of its release. It is, however, also a shame, since Marc Forster has directed something gentle, pleasant, and ultimately very sweet. Fans of Disney’s particular take on A.A. Milne’s Winnie the Pooh characters will be particularly charmed. Children with patience for the film’s slow pace will likely be delighted.

The film’s design and effects are wonderful, using Disney’s vision of the Pooh characters with a sort of worn, threadbare aesthetic that makes these legacy characters feel somewhat like a cross between Disney’s designs and the original E.H. Shepard illustrations from Milne’s novels. The animation of these characters, blending puppetry movement and CGI, is the highlight. They feel real, and lovable, and bring back numerous childhood memories. They are also beautifully voiced, including such spot-on castings as Peter Capaldi as Rabbit, Toby Jones as Owl, and Brad Garrett as Eeyore.

As Pooh, Jim Cummings is doing the best work of his career. The character’s distinctive creaky voice, sounding both elderly and young, was developed by the legendary Sterling Holloway back in 1966. Cummings took over the role in 1988. Over the decades Cummings has done an excellent act of mimicry in the role. Here he exceeds it with a tremendous act of nuance and gentle, heartfelt emotion. He is regularly funny, but here he also extends to the heart-breaking. It is a shame that the Academy Awards specifically avoid voice-acting; this year he would be a top contender. His work is not just talented. It is definitive.

Ewan McGregor brings a lot of polite, mannered appeal to the title role, but of course he is already a veteran of acting against CGI images, thanks to his three-film role as Obi-Wan Kenobi in the Star Wars prequels. Hayley Atwell is good, but saddled with an unwisely small role as Christopher’s long-suffering wife. Christopher’s unsympathetic boss is played by Mark Gatiss, who essentially plays Mark Gatiss: a few exceptions aside, like last year’s The Favourite, he always seems to play the same kind of character in the same kind of way.

The film is admittedly too slow, and it takes much too long to bring in the animated characters that will capture the audience’s attention and particularly entertain the young target audience. It is beautifully designed and filmed. It brings a very strong sense of prestige and importance too – after all it is Winnie the Pooh’s live-action feature debut – but as an entertaining, mainstream family film it does feel just a little lacking where it counts. It feels admirable rather than excellent, and once every so often it feels like an absolute delight.

posted by [syndicated profile] dg_weblog_feed at 07:00am on 16/02/2019

Posted by diamond geezer

If you're reading this at 7am, there are precisely 1000 hours to go until the UK leaves the EU (or doesn't, or delays making a decision until a later date, or whatever).

And still nobody is any clearer about what the final outcome of those 1000 hours will be.

We're here thanks to a sequence of collective decisions with unexpected outcomes.

Not forgetting these key choices...

Next we'll be facing this...



...and then, who knows?

There's no Parliamentary majority for any of these.


Even trying to extend Article 50 might not save us.


Asking the public probably wouldn't settle anything.




The Prime Minister seems keen to drag the final vote right down to the wire.


But if nobody makes a decision, then the default option is fixed.


Imagine getting to the last days of March with no decision in sight.


Or perhaps a No Deal scenario wouldn't be so bad after all.


In 1000 hours time we should have a much clearer idea of our nation's fate.


But still nobody knows what that fate will be...


...nor how we get there.
posted by [syndicated profile] crpgaddict_feed at 12:00am on 16/02/2019

Posted by CRPG Addict

The title shows that, just as with The Game of Dungeons, "DND" was just a file name, not the game name.
If you haven't had a chance to check out the "Data Driven Gamer," it's worth a visit. The author, Ahab, is still building his readership base, much like I was in 2011. He's more expansive in his selection of games than I am, but his particular focus is to analyze the games' quantitative elements, while still supplying a lot of commentary on the qualitative ones.

Ahab did a great job in the last couple of years analyzing The Dungeon and The Game of Dungeons, prompting me to go back and win those games. But those contributions pale in comparison to what he did last month. For the first time that I'm aware of, he figured out how to get a version of Daniel Lawrence's DND operating on a VAX emulator. For decades, we've had to reconstruct this missing link between the PLATO Game of Dungeons and the commercial Telengard based on player memories, adaptations, and interpretations of source code. Ahab not only showed the game in action, but he won it and supplied a full set of maps (for one of the three dungeons) as part of the process. His material is key to understanding this particular, peculiar line of CRPGs. Among other things, the ability to actually play this game shows that only the file name was DND; the title was--copyrights be damned--Dungeons & Dragons.
Gameplay in the VMS/VAX DND. My graphics are all messed up because of a line feed issue that I can't solve. The dungeon walls don't really look this chaotic.
Untangling the history of this particular lineage has been difficult, largely because of horrendous misinformation, much of it perpetrated (or at least not corrected) by Lawrence himself, who died in 2010 at the age of 52. (Among other things, he explicitly designated this page, which is so hopelessly confused I don't know where to begin, as the "official DND site." The authors do deserve credit for aggregating and preserving important files.) To read some sites, Lawrence is the father of the entire CRPG line, having written the first DND as early as 1972--two years before tabletop Dungeons & Dragons! His game was so popular, some articles have alleged, that students at the University of Indiana decided to adapt it as The Game of Dungeons. (Of course, it was the other way around.) Even writers who haven't so thoroughly confused the timeline have accepted Lawrence's assertions that he wrote "his" DND entirely on his own, with no reference to any other game, despite the fact that it clearly borrows elements from the PLATO Game of Dungeons and that Lawrence went to a university (Purdue) connected to PLATO. In a 2007 interview with Matt Barton, he suggests that his "play testers" might have played The Game of Dungeons and brought ideas to him. To me, such a scenario doesn't begin to explain the similarities between the games.
Daniel Lawrence in an undated photograph. Credit unknown.
The best truth that I can determine with the available evidence is that Lawrence wrote his first version of DND in 1976 or 1977, clearly after being exposed to The Game of Dungeons on PLATO. I'm inclined to think that 1977 is the more likely date, since DND is closer in similarity to Version 6 of The Game of Dungeons, which wasn't released until 1977. Then again, elements of The Game Version 8 (1978) also seem to show up in Lawrence's work, so it's possible he went back to the well several times during the development of his adaptation. The existence of several mainframe versions would support this thesis.

As we'll see, Lawrence made plenty of additions, and to recognize that he plagiarized from The Game is not to deny his own skill and innovations. His primary contribution was releasing the game to the wider world, first by writing a version for Purdue's DEC RSTS/E system. (In Lawrence's own words, the game was "the cause of more than one student dropping out" and "made me very unpopular with the computing staff at Purdue.") Engineers from DEC maintaining Purdue's system became familiar with the game and liked it so much that in 1979, they asked Lawrence to come to their Massachusetts headquarters and write a port for DEC's PDP-10 mainframe running the TOPS-20 operating system. (There are hints within DEC documents that Lawrence may have been paid for this, and that DEC's intention was to offer the game with its installations. The specific agreement between Lawrence and DEC has not come to light.) This version was subsequently disseminated in many locations where DECs were installed. The VMS/VAX version that Ahab got running seems to have been ported from this mainframe version.

By then, Lawrence had already been porting the game to the micro-computer. In 1978, he wrote a version for the Commodore PET that he titled Telengard, which had been the name of one of the explorable dungeons in DND. Representatives from Avalon Hill ran into Lawrence demoing the game at a convention in 1980 or 1981 and offered him a publishing deal, which ultimately saw PET, Commodore 64, Apple II, TRS-80, Atari 800, and MS-DOS releases starting in 1981 or 1982.
The title screen from the Commodore PET version of Telengard. The 1981 date seems unlikely as the actual release year.
(None of the histories of Lawrence or Telengard mention the specific convention at which this meeting occurred, but I found a likely session in the GenCon XIV program from August 1981. Unless Lawrence ran the same competition multiple years [I can't find the previous year's catalog], it seems unlikely that Telengard had a pre-1982 release date despite the copyright date on some versions of the game.)
In 1981, Lawrence ran a "contest" in which players competed for high scores or other status in some version of DND. Someone from Avalon Hill attended the session, and the result was the commercial Telengard.
From then on, Lawrence and Avalon Hill waged war on the ubiquitously-released free versions of the game, ordering their removal from every system on which they appeared. For its part, DEC acceded to legal threats from Avalon Hill, resulting in the modern difficulty reconstructing what those early versions looked like. You can read a long, fun e-mail chain here in which DEC employees try to argue law with their own legal department. Hilariously, various employees request assistance in finding the Orb throughout the thread while their exasperated bosses remind them that the game isn't supposed to exist on any DEC machine anymore.
A DEC executive orders the deletion of DND from DEC machines.
If Lawrence was guilty of some disingenuous behavior in trying to quash free versions of a game he partly plagiarized, it came back to bite him in repeated plagiarisms of his versions. We've seen plenty of them on this blog, including the so-called "Heathkit DND" (in actuality, also titled Dungeons and Dragons) of 1981, R.O. Software's DND (1984), and Thomas Hanlin's Caverns of Zoarre (1984). There are other BBS and shareware versions of the game that we haven't tried.
A DND "family tree."
That's the history. But what is Dungeons & Dragons? It's a text-based game with ASCII graphics in which a single character navigates one of three 20-level dungeons in a quest to retrieve a magic orb from a dragon. The layout of the dungeon and the locations of many of the special encounters are fixed, but the locations of combats and miscellaneous treasure finds are so random that you could encounter a never-ending stream of them from the same dungeon square. Combats are with a small menagerie of enemies, each with different strengths and vulnerabilities to the game's various spells. The character gains experience through both combat and treasure-finding, with miscellaneous encounters increasing and decreasing his attributes and providing him with magical gear. When he feels strong enough, he takes on the final dungeon level, recovers the orb, and--if he makes it back alive--gets his name on a leaderboard of "orb finders."

As I mentioned, there are too many elements copied directly from The Game of Dungeons for it to be remotely possible that Lawrence never saw it. These include:

  • The basic approach to game mechanics and goals, including the existence of permadeath.
  • A character creation process that includes a "secret name" for each character, serving as a kind of password
The need for a "secret name" is drawn from The Game of Dungeons, but the full set of attributes, the choice of character classes, and the choice of dungeons is new to DND.
  • The number of dungeon levels.
  • A main quest to recover an orb.
  • Carrying treasure out of the dungeon converts it to experience points.
My character levels up from a treasure haul.
  • A list of successful characters called "finders."
  • The existence of a transportation device, called "Excelsior," that moves you among the levels.
  • Basic combat options of (F)ight, (C)ast, and (E)vade.
  • A small number of monsters who have numeric levels assigned.
  • Many of the magic items are identical. Items can be trapped (although Lawrence's traps are more creative).
  • Treasure is found in both chests and random piles. Chests contain vastly more gold than the random piles.
  • Magic books that can raise or lower your attributes.
DND's handling of chests and books is the same as The Game of Dungeons.
  • Pits that you can fall down, dumping you on lower levels.
Luckily, I spotted this one.
It's also possible that Lawrence took a few elements from the earlier The Dungeon, including the organization of spells into a number of "slots" per level as well as some of the treasures you can find in the dungeon and their relative conversion to gold.

But Lawrence also added some new things to the Game of Dungeons template, some making it better, some making it poorer. These include:

  • DND has no graphics. Walls and corridors are ASCII characters and the main character is represented as an X. The Game of Dungeons had graphics for geography, the PC, monsters, equipment, gold, and so forth.
  • Instead of just "gold," the player finds a variety of different treasure types that are converted to gold.
  • DND dungeon levels are much larger.
  • The Excelsior transporter exists on every level in DND, not just the top one.
  • A full set of tabletop Dungeons and Dragons attributes. The Game of Dungeons just had strength, intelligence, and dexterity. DND adds constitution and charisma.
A DND "character sheet."
  • While the character in Game was a multi-classed fighter/magic-user/cleric, DND has the player specify a choice of these classes. As such, combat is rebalanced so that you don't need to cast particular spells to ensure victory, and a pure fighter has a shot at winning. Spells, which could reliably one-shot certain enemies in The Game, are significantly reduced in power. They're also more in line with tabletop Dungeons and Dragons and, it must be said, a lot less silly than The Game.
  • There's no distinction between experience and gold in DND, as there was in Game through Version 5. The Game also changed to a single experience pool starting in Version 6, so Lawrence may have been influenced by the later one.
  • DND offers three dungeons to explore--Telengard, Svhenk's Lair, and Lamorte--each of which might contain the orb.
  • Game resolved combats all at once. DND shows round-for-round results.
DND's approach is generally better, but sometimes you wish it would just hurry up and get it done.
  • DND completely randomizes the appearance of treasure. The Game "seeded" each level with gold and chests whenever you entered, and you could clear the level, but in DND, treasure has a chance of showing up in every square as you move to it, including those you've already explored.
  • DND adds more special encounters at fixed locations, including thrones, altars, fountains, dragons' lairs, and doors with combination codes.
Special encounters with altars are a new element in DND.
  • Lawrence replaced the awkward "teleporters" with stairs that remain in a fixed location.
  • DND includes a greater variety of equipment, including magic weapons other than swords. The pluses go much higher, too. Where The Game capped at +3, DND allows higher than +20.
  • DND adds cute atmospheric messages as you explore. Examples: "A mutilated body lies on the floor nearby"; "'Turn back!!!' a voice screams"; "The room vibrates as if an army is passing by." There's even a reference to Colossal Cave Adventure and its hollow voice that says "PLUGH."

Finally, it's worth noting some of the changes between DND and Telengard:

  • Telengard has no main quest. The only objective is to get stronger and richer. For years, I thought this was a defining feature of the sub-genre, but it turns out that it's actually quite rare. Most variants have some kind of main quest.
  • Telengard has only one dungeon, randomly drawn every time you start a new game.
  • The appearance of thrones, fountains, altars, and other special features is completely randomized, just like monsters and miscellaneous treasure. A player can encounter everything that Telengard has to offer by passing time in a single square.
  • Telengard has graphics.
  • Telengard has an expanded selection of items, including potions and scrolls.
Telengard is a nicer-looking game, but the greater randomization creates a chaotic experience.
Only the last item is a clear "improvement." Telengard is arguably a dumbing-down of gameplay in DND. The lack of any main quest is particularly notable, and one wonders why Lawrence or Avalon Hill made the decision to exclude one. Perhaps they thought the game had greater replayability if the only goal was to create a stronger character.

For all the ink writers like me have spilled on Lawrence and his game, it arguably had the least impact of the major lineages that began in the late 1970s and early 1980s. During its day, DND offered perhaps the best simulation of the mechanics of tabletop role-playing on a computer, but its arrival on the micro-computer scene was far too late to have any impact. By the time that Telengard was released, it had already been outclassed by Ultima and games in the Moria/Oubliette/Wizardry line. The direct influence of DND can only really be felt in its few clones, for which there was so small a market that they had to be released as shareware.
Gameplay from the Heathkit Dungeons and Dragons (1981).

Gameplay in R.O. Software's DND (1984)
Gameplay in Caverns of Zoarre (1984)
There is one small exception, and to analyze it we must first note that DND did a reasonably good job anticipating the roguelike sub-genre. In fact, it's hard not to call it a pre-Rogue "roguelike," what with its random encounters, permadeath, and MacGuffin on the 20th floor. And yet it's hard to detect any direct influence on Rogue. (To some extent, Rogue feels like a game created by someone who heard about DND but never played it.) To my knowledge, the developers of Rogue have never acknowledged any direct influence except Star Trek (1971), Colossal Cave Adventure (1976), and a general desire to emulate table-top role-playing.

However, I do think that someone on the NetHack development team was exposed to DND, or at least Telengard. I base this on the variety of special encounters that were introduced to the game at some point between Hack and NetHack 2.3e, including thrones that do different things when you sit on them and offer the ability to pry gems out of them; fountains that have a variety of effects; and altars that ask for money. Granted, thrones, fountains, and altars are fantasy staples that may have been introduced independently, but the specific way that you use them is so similar to DND that I think there must be a connection. It's a minor legacy, but still worth acknowledging.
Sitting on thrones in NetHack has many of the same consequences as in DND.

Ahab was kind enough to send me the instructions I needed to emulate DND myself. I tried for a while, but I couldn't solve an issue (involving line feeds) that created chaos out of the dungeon maps. (The solution he offers on his blog didn't work for me despite us both having the same version of Windows.) Such a win would have been superfluous coming right on the heels of his own victory anyway. I may return to it at some point in the future, just for the statistic, but not soon.

This entry will serve as my final word on this line of games, which we've visited in bits and pieces since the first year of my blog. If any new information comes to light, I'll include edits in this entry rather than writing anew. In the meantime, there are dozens of web pages and Wiki articles that I don't imagine will be similarly corrected. Daniel Lawrence deserves credit for what he accomplished, but he is not the grandfather or even father of CRPGs.

posted by [personal profile] siderea at 12:37am on 16/02/2019
I finally decided what I'm doing with rugs, at least in the main room, and ordered them, and also rug pads. I'm ordering from Overstock.com, which feels like something of a gamble, but a good gamble. I'm not sure I made the right choice, but I feel good that I made a choice, and can now move on to the next things on my list.

It abruptly dawned on me in the middle of checking out – at the point where it proposed to ship the rugs to my old address, not yet knowing about my new address – that, omg, my credit card company was going to see all these largish purchases on my card being shipped to an address that, as far as they know, isn't mine.

So I called my credit card company at five to midnight to explain that I was moving and to please not screw up my rug order, and, while I have you here, here's my new mailing address.

Posted by cks

One of the things that comes up over and over again when formatting output is that you want to output a list of things with some separator between them but you don't want this separator to appear at the start or the end, or if there is only one item in the list. For instance, suppose that you are formatting URL parameters in a tiny little shell script and you may have one or more parameters. If you have more than one parameter, you need to separate them with '&'; if you have only one parameter, the web server may well be unhappy if you stick an '&' before or after it.

(Or not. Web servers are often very accepting of crazy things in URLs and URL parameters, but one shouldn't count on it. And it just looks irritating.)

The very brute force approach to this general problem in Bourne shells goes like this:

tot=""
for i in "$@"; do
  if [ -z "$tot" ]; then
    tot="$i"
  else
    tot="$tot&$i"
  fi
done

But this is five or six lines and involves some amount of repetition. It would be nice to do better, so when I had to deal with this recently I looked into the Dash manpage to see if it's possible to do better with shell substitutions or something else clever. With shell substitutions we can condense this a lot, but we can't get rid of all of the repetition:

tot=""
for i in "$@"; do
  tot="${tot:+$tot&}$i"
done
It annoys me that tot is repeated in this. However, this is probably the best all-around option in normal Bourne shell.

Bash has arrays, but the manpage's documentation of them makes my head hurt and this results in Bash-specific scripts (or at least scripts specific to any shell with support for arrays). I'm also not sure if there's any simple way of doing a 'join' operation to generate the array elements together with a separator between them, which is the whole point of the exercise.

(But now I've read various web pages on Bash arrays so I feel like I know a little bit more about them. Also, on joining, see this Stackoverflow Q&A; it looks like there's no built-in support for it.)
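For the record, one Bash-specific workaround those Q&As describe is that "${arr[*]}" joins array elements with the first character of IFS. A sketch (the variable names here are just for illustration):

```shell
#!/bin/bash
# Join array elements with '&' by (ab)using IFS and "${arr[*]}".
params=("a=1" "b=2" "c=3")
old_ifs="$IFS"
IFS='&'
joined="${params[*]}"   # elements joined with the first char of IFS
IFS="$old_ifs"
echo "$joined"          # a=1&b=2&c=3
```

This is terse, but it mutates IFS, which is exactly the kind of global side effect that makes shell scripts fragile; hence the appeal of the plain '${var:+word}' loop.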

In the process of writing this entry, I realized that there is an option that exploits POSIX pattern substitution after generating our '$tot' to remove any unwanted prefix or suffix. Let me show you what I mean:

tot=""
for i in "$@"; do
  tot="$tot&$i"
done
# remove leading '&':
tot="${tot#&}"

This feels a little bit unclean, since we're adding on a separator that we don't want and then removing it later. Among other things, that seems like it could invite accidents where at some point we forget to remove that leading separator. As a result, I think that the version using '${var:+word}' substitution is the best option, and it's what I'm going to stick with.

posted by [personal profile] siderea at 08:55pm on 15/02/2019
I went by my new place today and got the keys.

I brought a roll of toilet paper, because priorities, but there was one already there.

I hung out and did more measuring. The doors have a 3/4th inch clearance so a rug should be fine. There are no jambs for the closet doors, so there's no obvious way to sneak coax in front of them.

The broken blind was replaced. The heat is still wonky; according to the thermostat it was 74degF in there, despite the thermostat being set to 65. Will nudge the super on it. This may not actually be a thermostat malfunction (though the radiator in the bathroom is Doing Its Part For Global Warming(tm)) so much as a function of being on about the third floor (depending on whether or not you count the basement, which is where the laundry room is) and heat rising.

I've filed my change of address with USPS.

Posted by John

There are an infinite number of elliptic curves, but a small number that are used in cryptography, and these special curves have names. Apparently there are no hard and fast rules for how the names are chosen, but there are patterns.

The named elliptic curves are over a prime field, i.e. a finite field with a prime number of elements p. The number of points on the elliptic curve is on the order of p [1].

The curve names usually contain a number which is the number of bits in the binary representation of p. Let’s see how that plays out with a few named elliptic curves.

    | Name             | bits in p |
    | ANSSI FRP256v1   |       256 |
    | BN(2, 254)       |       254 |
    | brainpoolP256t1  |       256 |
    | Curve1174        |       251 |
    | Curve25519       |       255 |
    | Curve383187      |       383 |
    | E-222            |       222 |
    | E-382            |       382 |
    | E-521            |       521 |
    | Ed448-Goldilocks |       448 |
    | M-221            |       221 |
    | M-383            |       383 |
    | M-511            |       511 |
    | NIST P-224       |       224 |
    | NIST P-256       |       256 |
    | secp256k1        |       256 |

The first three curves in the list use a prime p that does not have a simple binary representation.

In Curve25519, p = 2^255 − 19, and in Curve383187, p = 2^383 − 187. Here the number of bits in p is part of the name, but another number is stuck on.

The only mystery on the list is Curve1174 where p has 251 bits. The equation for the curve is

x² + y² = 1 – 1174 x² y²

and so the 1174 in the name comes from a coefficient rather than from the number of bits in p.
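These bit counts are easy to verify directly; here is a quick sketch using Python's `int.bit_length()` and the primes quoted in this post:

```python
# Primes underlying a few of the curves above; bit_length() gives
# the number of bits in the binary representation of p.
curves = {
    "Curve25519":  2**255 - 19,
    "Curve383187": 2**383 - 187,
    "Curve1174":   2**251 - 9,
}
for name, p in curves.items():
    print(name, p.bit_length())   # 255, 383, and 251 respectively
```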

Edwards curves

The equation for Curve1174 doesn’t look like an elliptic curve. It doesn’t have the familiar (Weierstrass) form

y² = x³ + ax + b

It is an example of an Edwards curve, named after Harold Edwards. So are all the curves above whose names start with “E”. These curves have the form

x² + y² = 1 + d x² y².

where d is not 0 or 1. So some Edwards curves are named after their d parameter and some are named after the number of bits in p.

It’s not obvious that an Edwards curve can be changed into Weierstrass form, but apparently it’s possible; this paper goes into the details.

The advantage of Edwards curves is that the elliptic curve group addition has a simple, convenient form. Also, when d is not a square in the underlying field, there are no exceptional points to consider for group addition.
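The group addition law is simple enough to sketch in a few lines. Here is a minimal illustration over a toy field (p = 13 and d = 2 are chosen purely for the example, not taken from any named curve); when d is a non-square, the same formulas handle every pair of points with no exceptional cases:

```python
def edwards_add(P, Q, d, p):
    """Add two points on the Edwards curve x² + y² = 1 + d·x²·y² over F_p."""
    x1, y1 = P
    x2, y2 = Q
    t = d * x1 * x2 * y1 * y2 % p
    # pow(x, -1, p) computes the modular inverse (Python 3.8+).
    x3 = (x1 * y2 + x2 * y1) * pow(1 + t, -1, p) % p
    y3 = (y1 * y2 - x1 * x2) * pow(1 - t, -1, p) % p
    return (x3, y3)

# Toy example: p = 13, d = 2 (a non-square mod 13).
# (0, 1) is the group identity; (1, 0) lies on the curve.
p, d = 13, 2
print(edwards_add((1, 0), (0, 1), d, p))  # identity law: (1, 0)
```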

Is d = -1174 a square in the field underlying Curve1174? For that curve p = 2^251 − 9, and we can use the Jacobi symbol code from earlier this week to show that d is not a square.

    # jacobi(a, n) is the Jacobi symbol function from the earlier post
    p = 2**251 - 9
    d = p - 1174
    print(jacobi(d, p))

This prints -1, indicating that d is not a square. Note that we set d to p – 1174 rather than -1174 because our code assumes the first argument is positive, and -1174 and p – 1174 are equivalent mod p.
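For readers without the earlier post handy, here is one standard, self-contained way to compute the Jacobi symbol (a sketch using the binary algorithm and quadratic reciprocity; the earlier post's implementation may differ):

```python
def jacobi(a, n):
    """Jacobi symbol (a/n) for odd n > 0."""
    assert n > 0 and n % 2 == 1
    a %= n
    result = 1
    while a != 0:
        # Pull out factors of two; (2/n) = -1 when n ≡ 3 or 5 (mod 8).
        while a % 2 == 0:
            a //= 2
            if n % 8 in (3, 5):
                result = -result
        # Quadratic reciprocity: flip the sign when both ≡ 3 (mod 4).
        a, n = n, a
        if a % 4 == 3 and n % 4 == 3:
            result = -result
        a %= n
    return result if n == 1 else 0  # 0 when gcd(a, n) > 1

p = 2**251 - 9
d = p - 1174
print(jacobi(d, p))  # prints -1: d is not a square mod p
```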


[1] It is difficult to compute the exact number of points on an elliptic curve over a prime field. However, the number is roughly p ± 2√p. More precisely, Hasse’s theorem says

|\#E(\mathbb{F}_p) - (p + 1)| \leq 2\sqrt{p}
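Hasse's bound is easy to sanity-check by brute force on a small curve. The sketch below counts points on the (arbitrarily chosen) Weierstrass curve y² = x³ + x + 1 over F_101 and verifies the bound:

```python
def count_points(a, b, p):
    """Count points on y² = x³ + ax + b over F_p, including infinity."""
    # Precompute how many square roots each residue has mod p.
    roots = {}
    for y in range(p):
        r = y * y % p
        roots[r] = roots.get(r, 0) + 1
    n = 1  # the point at infinity
    for x in range(p):
        n += roots.get((x**3 + a * x + b) % p, 0)
    return n

p = 101
n = count_points(1, 1, p)
print(n, abs(n - (p + 1)) <= 2 * p**0.5)  # Hasse: |n - (p+1)| ≤ 2√p
```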

February 15th, 2019

Posted by admin

900 years ago, a group of people moved to Guangzhou from other parts of the province and settled close to the Pearl River. Liede Village started to form.

Fast-forward 900 years: the city of Guangzhou is spreading and modernizing. The old village was demolished and new residential buildings were built. The villagers received new apartments in these buildings on their land, many getting ten or more.

Later, shopping malls and business centers were built on the Liede villagers’ land as well, and a share of that income is distributed to the villagers, more than 50,000 CNY per year. No wonder Liede Village is called the richest village in Guangzhou!

But old traditions aren’t forgotten! New family temples were built for the Li, Lin, Liang, and Mai clans. Each year each clan holds a magnificent banquet, the biggest being the Li clan’s, with more than 500 people.

Yesterday we got to take part in the Lin clan’s celebration, and it was an amazing experience. My company Expat Chinese hosted the event together with Banana Tour, and 16 foreign and Chinese friends joined us.

The event started with a lion dance performance, followed by a dragon dance and kung fu shows. Everyone gathered in front of the Lin Clan Temple to watch the one-hour show.

After the show it was time for dinner! Among three hundred tables we found our two tables and the dishes started to roll in one after another.

Each table seemed to have enough food to feed far more than the ten people seated around it. We enjoyed local dishes of goose, chicken, pork, shrimp, vegetables, seafood, and dessert. Soft drinks, apple vinegar cider, and rice wine were served at every table as well.

Participating in this event was such a unique experience, because usually only family members and relatives are invited. Thank you, Sissi from Banana Tour, for making it happen!

posted by [syndicated profile] montecookgames_feed at 10:20pm on 15/02/2019

Posted by Tammie Webb Ryan

Cypher Chronicles, vol. 7-2019

Calaval’s Parable of the Imager and the Spider, loads of fulfillment info, call for GMs, and more…all in today’s Cypher Chronicles! You can get Cypher Chronicles, and other MCG news, delivered right to your inbox! Enter your email address and click the Subscribe button in the right-hand column, and you’ll never miss a post.

The post Cypher Chronicles, vol. 7-2019 appeared first on Monte Cook Games.

Posted by Laura Staugaitis

Based on “Joséphine-Éléonore-Marie-Pauline de Galard de Brassac de Béarn, Princesse de Broglie” by Jean Auguste Dominique Ingres

Greek artist and art director Dimitris Ladopoulos (previously) continues to use a treemapping algorithm, built in the 3D software Houdini, to interpret paintings from the art history canon. The program calculates the density of information in a user-provided image and then divides it based on selected parameters, creating a pixelated effect that forms distinct color tiles of varying heights. In a statement about the project, Ladopoulos draws a comparison between treemapping and the original painter’s use of varied brushstrokes to bring fine detail, color variation, and texture to select areas of the canvas. You can see more of Ladopoulos’s work on Behance and Instagram.

Based on “Mona Lisa” by Leonardo da Vinci

Based on “Portrait of a Young Man” by Titian

Based on “Vincent van Gogh” by John Peter Russell

Based on “Young Woman with a Water Pitcher” by Johannes Vermeer


Posted by Kate Sierzputowski

Bruno Pontiroli creates mind-bending explorations of the relationship between humans and animals, painting limber cows doing impressive handstands or an over-eager man embracing a large walrus, much to its chagrin. The artist shies away from labeling his work as Surrealist or Dadaist, instead proposing a new version of reality without categorization. Pontiroli will exhibit work with Galerie Klaus Kiefer at art KARLSRUHE from February 21 to 24, 2019 and with Fousion Gallery at Urvanity Art Madrid from February 28 to March 3, 2019. You can peek further inside Pontiroli’s bizarre world of shape-shifting humans and balancing bovines on his website and Instagram.

posted by [syndicated profile] digital_antiquarian_feed at 04:54pm on 15/02/2019

Posted by Jimmy Maher

From the time that Richard and Robert Garriott first founded Origin Systems in order to publish Ultima III, the completion of one Ultima game was followed almost immediately by the beginning of work on the next. Ultima VI in early 1990 was no exception; there was time only for a wrap party and a couple of weeks of decompression before work started on Ultima VII. The latter project continued even as separate teams made the two rather delightful Worlds of Ultima spinoffs using the old Ultima VI engine, and even as another Origin game called Wing Commander sold far more copies than any previous Ultima, spawning an extremely lucrative new franchise that for the first time ever made Origin into something other than The House That Ultima Built.

But whatever the source, money was always welcome. The new rival for the affections of Origin’s fans and investors gave Richard Garriott more of it to play with than ever before, and his ambitions for his latest Ultima were elevated to match. Continual technological improvement had always been a core part of the series’s ethos. Garriott had long considered it a point of pride to never use the same engine twice (a position he had budged from only reluctantly when he allowed the Worlds of Ultima spinoffs to be made). Thus it came as no surprise that he wanted to push things forward yet again with Ultima VII. Even in light of the series’s tradition, however, this was soon shaping up to be an unusually ambitious installment — indeed, by far the most ambitious technological leap that the series had made to date.

As I noted in my article on that game, the Ultima VI engine was, at least when seen retrospectively, a not entirely comfortable halfway point between the old “alphabet soup” keyboard-based interface of the first five games and a new approach which fully embraced the mouse and other modern computing affordances. Traces of the old were still to be found scattered everywhere amidst the new, and using the interface effectively meant constantly switching between keyboard-centric and mouse-centric paradigms for different tasks. Ultima VII would end such equivocation, shedding all traces of the interfaces of yore.

These screenshots from a Computer Gaming World preview of the game provide an interesting snapshot of Ultima VII in a formative state. The graphics are less refined than the final version, but the pop-up interface and the graphical containment model — more on that fraught subject later — are in place.

For the first time since Richard Garriott had discovered the magic of tile graphics in his dorm room at the University of Texas, the world of this latest Ultima was not to be built using that technique; Origin opted instead for a free-scrolling world shown from an overhead perspective, canted just slightly to convey the impression of depth. Gone along with the discrete tiles were the discrete turns of the previous Ultima games, replaced by true real-time gameplay. The world model included height — 16 possible levels of it! — as well as the other dimensions; characters could climb stairs to other floors in a building or walk up a hillside outdoors while remaining in the same contiguous space. In a move that must strike anyone familiar with the games of today as almost eerily prescient, Origin excised any trace of static onscreen interface elements. Instead the entire screen was given over to a glorious view of Britannia, with the interface popping up over this backdrop as needed. The whole production was designed with the mouse in mind first and foremost. Do you want your character to pick up a sword? Click on him to bring up his paper-doll inventory display, then drag the sword with the mouse right out of the world and into his hand. All of the things that the Ultima VI engine seemed like it ought to be able to do, but which proved far more awkward than anticipated, the Ultima VII engine did elegantly and effortlessly.

Looking for a way to reduce onscreen clutter and to show as much of the world of Britannia as possible at one time, Origin realized they could pop up interface elements only when needed. This innovation, seldom seen before, has become ubiquitous in the games — and, indeed, in the software in general — of today.

Origin had now fully embraced a Hollywood-style approach to game production, marked by specialists working within strictly defined roles, and the team which built Ultima VII reflected this. Even the artists were specialized. Glen Johnson, a former comic-book illustrator, was responsible for the characters and monsters as they appeared in the world. Michael Priest was the resident portrait artist, responsible for the closeups of faces that appeared whenever the player talked to someone. The most specialized artistic role of all belonged to Bob Cook, a landscape artist hired to keep the multi-level environment coherent and proportional.

Of course, there were plenty of programmers as well, and they had their work cut out for them. Bringing Garriott’s latest Ultima to life would require pushing the latest hardware right to the edge and, in some situations, beyond it. Perhaps the best example of the programmers’ determination to find a way at all costs is their Voodoo memory manager. Frustrated with MS-DOS’s 640 K memory barrier and unhappy with all of the solutions for getting around it, the programming team rolled up their sleeves and coded a solution of their own from scratch. It would force virtually everyone who played the game at its release to boot their machines from a custom floppy, and would give later users even more headaches; in fact, it would render the game unplayable on many post-early-1990s machines, until the advent of software emulation layers like DOSBox. Yet it was the only way the programming team could make the game work at all in 1992.

As usual for an Ultima, the story and structure of play evolved only slowly, as the strengths and limitations of the technology that would need to enable them became clear. Richard Garriott began with one overriding determination: he wanted a real bad guy this time, not just someone who was misguided or misunderstood: “We wanted a bad guy who was really evil, truly, truly evil.” He envisioned an antagonist for the Avatar cut from the classic cloth of novelistic and cinematic villains, one who could stick around for at least the next few games. Thus was born the disembodied spirit of evil known as the Guardian, who would indeed proceed to dog the Avatar’s footsteps all the way through Ultima IX. One might be tempted to view this seeming return to a black-versus-white conception of morality as a step back for the series thematically. But, as Garriott was apparently aware, the moral plot twists of the previous two games risked becoming a cliché in themselves if perpetuated indefinitely.

Then too, while Ultima VII would present a story carrying less obvious thematic baggage than the last games, that story would be executed far more capably than any of those others. For, as the most welcome byproduct of the new focus on specialization, Origin finally hired a real writing team.

Raymond Benson and Richard Garriott take the stage together for an Austin theatrical fundraiser with a Valentines Day theme. Benson played his “love theme” from Ultima VII while Garriott recited “The Song of Solomon” — with tongue planted firmly in cheek, of course.

The new head writer, destined to make a profound impact on the game, was an intriguingly multi-talented fellow named Raymond Benson. Born in 1955, he was a native of Origin’s hometown of Austin, Texas, but had spent the last decade or so in New York City, writing, directing, and composing music for stage productions. As a sort of sideline, he’d also dabbled in games, writing an adventure for the James Bond 007 tabletop RPG and writing three text-adventure adaptations of popular novels during the brief mid-1980s heyday of bookware: The Mist, A View to a Kill, and Goldfinger. Now, he and his wife had recently started a family, and were tired of their cramped Manhattan flat and the cutthroat New York theater scene. When they saw an advertisement from Origin in an Austin newspaper, seeking “artists, musicians, and programmers,” Benson decided to apply. He was hired to be none of those things — although he would contribute some of his original music to Ultima VII — but rather to be a writer.

When he crossed paths with the rest of Origin Systems, Benson was both coming from and going to very different places than the majority of the staff there, and his year-long sojourn with them proved just a little uncomfortable. Benson:

It was like working in the boys’ dormitory. I was older than most of the employees, who were 95 percent male. In fact, I believe less than ten out of fifty or sixty employees were over thirty, and I was one of them. So, I kind of felt like the old fart a lot of times. Most of the employees were young single guys, and it didn’t matter to them if they stayed at the office all night, had barbecues at midnight, and slept in a sleeping bag until noon. Because I had a family, I needed to keep fairly regular 8-to-5 hours, which is pretty impossible at a games company.

A snapshot of the cultural gulf between Benson and the average Origin employee is provided by an article in the company’s in-house newsletter entitled “What Influences Us?” Amidst lists of “favorite fantasy/science fiction films” and “favorite action/adventure films,” Benson chooses his “ten favorite novels,” unspooling an eclectic list that ranges from Dracula to The Catcher in the Rye, Lucky Jim to Maia — no J.R.R. Tolkien or Robert Heinlein in sight!

Some of the references in Ultima VII feel like they just had to have come directly from the slightly older, more culturally sophisticated mind of Raymond Benson. Here, for instance, is a riff on Black Like Me, John Howard Griffin’s landmark work of first-person journalism about racial prejudice in the United States.

It’s precisely because of his different background and interests that Benson’s contribution to Ultima VII became so important. Most of the writing in the game was actually dialog, and deft characterization through dialog was something his theatrical background had left him well-prepared to tackle. Working with and gently coaching a team consisting of four other, less experienced writers, he turned Richard Garriott’s vague story outline, about the evil Guardian and his attempt to seize control of Britannia through a seemingly benign religious movement known as the Fellowship, into the best-written Ultima ever. The indelible Ultima tradition of flagrantly misused “thees” and “thous” aside, the writing in Ultima VII never grates, and frequently sparkles. Few games since the heyday of Infocom could equal it. Considering that Ultima VII alone has quite possibly as much text as every Infocom game combined, that’s a major achievement.

The huge contributions made by Raymond Benson and the rest of the writing team — not to mention so many other artists, programmers, and environment designers — do raise the philosophical question of how much Ultima VII can still be considered a Richard Garriott game, full stop. From the time that his brother Robert convinced him that he simply couldn’t create Ultima V all by himself, as he had all of his games up to that point, Richard’s involvement with the nitty-gritty details of their development had become steadily less. By the early 1990s, we can perhaps already begin to see some signs of the checkered post-Origin career in game development that awaited him — the career of a basically good-natured guy with heaps of money, an awful lot of extracurricular interests, and a resultant short attention span. He was happy to throw out Big Ideas to set the direction of development, and he clearly still relished demonstrating Origin’s latest products and playing Lord British, but his days of fussing too much over the details were, it seems, already behind him by the time of Ultima VII. Given a choice between sitting down to make a computer game or throwing one of his signature birthday bashes or Halloween spook houses — or, for that matter, merely playing the wealthy young gentleman-about-town in Austin high society more generally — one suspects that Garriott would opt for one of the latter every time.

Which isn’t to say that his softer skill set wasn’t welcome in a company in transition, in which tensions between the creative staff and management were starting to become noticeable. For the people on the front line actually making Ultima VII, working ridiculous hours under intense pressure for shockingly little pay, Garriott’s talents meant much indeed. He would swoop in from time to time to have lunch catered in from one of Austin’s most expensive restaurants. Or he would tell everyone to take the afternoon off because they were all going out to the park to eat barbecue and toss Frisbees around. And of course they were always all invited to those big parties he loved to throw.

Still, the tensions remained, and shouldn’t be overlooked. Lurking around the edges of management’s attitude toward their employees was the knowledge that Origin was the only significant game developer in Austin, a fast-growing, prosperous city with a lot of eager young talent. Indeed, prior to the rise of id Software up in Dallas, they had no real rival in all of Texas. Brian Martin, a scripter on Ultima VII, remembers being told that “people were standing in line for our jobs, and if we didn’t like the way things were, we could just leave.” Artist Glen Johnson had lived in Austin at the time Origin hired him to work in their New Hampshire office, only to move him back to Austin once again when that office was closed; he liked to joke that the company had spent more money on his plane fare during his first year than on his salary.

The yin to Richard Garriott’s yang inside Origin was Dallas Snell, the company’s hard-driving production manager, who was definitely not the touchy-feely type. An Origin employee named Sheri Graner Ray recounts her first encounter with him:

My interviews at Origin Systems culminated with an interview with Dallas Snell. He didn’t turn away from his computer, but sort of waved a hand in the general direction of a chair. I hesitantly took a seat. Dallas continued to type for what seemed to me to be two or three hours. Finally, he stopped, swung around in his desk chair, leaned forward, put one hand on his knee and the other on his hip, narrowed his eyes at me, and said, “You’re here for me to decide if I LIKE you.” I was TERRIFIED. Well, I guess he did, cuz I got the job, but I spent the next year ducking and avoiding him, as I figured if he ever decided he DIDN’T like me, I was in trouble!

Snell’s talk could make Origin’s games sound like something dismayingly close to sausages rolling down a production line. He was most proud of Wing Commander and Savage Empire, he said, because “these projects were done in twelve calendar months or less, as compared to the twenty-to-thirty-month time frame that previous projects were developed in!” Martian Dreams filled six megabytes on disk, yet was done in “seven calendar months!!! Totally unprecedented!!” Wing Commander II filled 15 megabytes, yet “the entire project will have been developed in eight calendar months!!!” He concluded that “no one, absolutely no one, has done what we have, or what we are yet still capable of!!! Not Lucasfilm, not Sierra, not MicroProse, not Electronic Arts, not anyone!” The unspoken question was, at what cost to Origin’s staff?

It would be unfair to label Origin Systems, much less Dallas Snell alone, the inventor of the games industry’s crunch-time culture and its unattractive byproduct and enabler, the reliance on an endless churn of cheap young labor willing to let themselves be exploited for the privilege of making games. Certainly similar situations were beginning to arise at other major studios in the early 1990s. And it’s also true that the employees of Origin and those other studios were hardly the first ones to work long hours for little pay making games. Yet there was, I think, a qualitative difference at play. The games of the 1980s had mostly been made by very small teams with little hierarchy, where everyone could play a big creative role and feel a degree of creative ownership of the end product. By the early 1990s, though, the teams were growing in size; over the course of 1991 alone, Origin’s total technical and creative staff grew from 40 to 120 people. Thus companies like Origin were instituting — necessarily, given the number of people involved — more rigid tiers of roles and specialties. In time, this would lead to the cliché of the young 3D modeller working 100-hour weeks making trees, with no idea of where they would go in the finished game and no way to even find out, much less influence the creative direction of the final product in any more holistic sense. For such cogs in the machine, getting to actually make games (!) would prove rather less magical than expected.

Origin was still a long way from that point, but I fancy that the roots of the oft-dehumanizing culture of modern AAA game production can be seen here. Management’s occasional attempts to address the issue also ring eerily familiar. In the midst of Ultima VII, Dallas Snell announced that “the 24-hour work cycle has outlived its productivity”: “All employees are required to start the day by 10:00 AM and call it a day by midnight. The lounge is being returned to its former glory (as a lounge, that is, without beds).” Needless to say, the initiative didn’t last, conflicting as it did with the pressing financial need to get the game done and on the market.

Simply put, Ultima VII was expensive — undoubtedly the most expensive game Origin had ever made, and one of the most expensive computer games anyone had yet made. Just after its release, Richard Garriott claimed that it had cost $1 million. Of course, the number is comically low by modern standards, even when adjusted for inflation — but this was a time when a major hit might only sell 100,000 units rather than the 10 million or so of today.

Origin had first planned to release Ultima VII in time for the Christmas of 1991, an impossibly optimistic time frame (impossibly optimistic time frames being another trait which the Origin of the early 1990s shares with many game studios of today). When it became clear that no amount of crunch would allow the team to meet that deadline, the pressure to get it out as soon as possible after Christmas only increased. Looking over their accounts at year’s end, Origin realized that 90 percent of their revenue in 1991 had come through the Wing Commander franchise; had Wing Commander II not become as huge a hit as the first installment, they would have been bankrupt. This subsidizing of Ultima with Wing Commander was an uncomfortable place to be, and not just for the impact it might have had on Lord British’s (alter) ego. It meant that, with no major Wing Commander releases due in 1992, an under-performing Ultima VII could take down the whole company. Many at Origin were surprisingly clear-eyed about the dangers which beset them. Mike McShaffry, a programmer and unusually diligent student of the company’s financial situation among the rank and file — unsurprisingly, he would later become an entrepreneur himself — expressed his concern: “The road ahead for us is a bumpy one. Many companies do not survive the ‘boom town’ growth phase that we have just experienced.”

Thus when Ultima VII: The Black Gate — the subtitle was an unusually important one, given that Origin had already authorized a confusingly titled Ultima VII Part Two using the same game engine — shipped on April 16, 1992, the whole company’s future was riding on it.

Classic games, it seems to me, can be plotted on a continuum between two archetypes. At one pole are the games which do everything right — those whose designers, faced with a multitude of small and large choices, have made the right choice every time. Ultima Underworld, the spinoff game which Origin released just two weeks before Ultima VII, is one of these.

The other archetypal classic game is much rarer: the game whose designers have made a lot of really problematic choices, to the point that certain parts of it may be flat-out broken, but which nevertheless charms and delights due to some ineffable spirit that overshadows everything else. Ultima VII is the finest example of this type that I can think of. Its list of trouble spots is longer than that of many genuinely bad games, and yet its special qualities are so special that I can only recommend that you play it.

Inventory management in Ultima VII. It’s really, really hard to find anything, especially in the dark. Of course, I could fire up a torch… but wait! My torches are buried somewhere under all that mess in my pack.

Any list of that which is confusing, infuriating, or just plain boring in Ultima VII must start with the inventory-management system. The drag-and-drop approach to same is brilliant in conception, but profoundly flawed in execution. You need to cart a lot of stuff around in this game — not just weapons and armor and quest items and money and loot, but also dozens of pieces of food to keep your insatiable characters fed on their journeys and dozens or hundreds of magic reagents to let you cast spells. All of this is lumped together in your characters’ packs as an indeterminate splodge of overlapping icons. Unless you formulate a detailed scheme of exactly what should go where and stick to it with the rigidity of a pedant, you’ll sometimes find it impossible to figure out what you actually have and where it is on your characters’ persons. When that happens, you’ll have to resort to finding a clear spot of ground and laying out the contents of each pack on it one by one, looking for that special little whatsit.

Keys belong to their own unique circle of Inventory Hell. Just a few pixels big, they have a particular tendency to get hopelessly lost at the bottom of your pack along with those leftover leeks you picked up for some reason in the bar last night. Further, keys are distinguished only by their style and color — the game does nothing so friendly as tell you what door a given key opens, even after you’ve successfully used it — and there are a lot of them. So, you never feel quite confident when you can’t open a door that you haven’t just overlooked the key somewhere in the swirling chaos vortex that is your inventory. If you really love packing your suitcase before a big trip, you might enjoy Ultima VII‘s inventory management. Otherwise, you’ll find it to be a nightmare.

The combat system is almost as bad. Clearly Origin, to put it as kindly as possible, struggled to adapt combat to the real-time paradigm. While you can assemble a party of up to eight people, you can only directly control the Avatar himself in combat, and that only under a fairly generous definition of “control.” You click a button telling your people to start fighting, whereupon everyone, friend and foe alike, converges upon the same pixel as occasional words — “Aargh!,” “To arms!,” “Vultures will pick thy bones!” — float out of the scrum. The effect is a bit like those old Warner Bros. cartoons where Wile E. Coyote and the Road Runner disappear into a cloud of arms and legs until one of them pops out victorious a few seconds later.

The one way to change this dynamic also happens to be the worst possible thing you can do: equipping your characters with ranged weapons. This will cause them to open fire indiscriminately in the vague direction of the aforementioned pixel of convergence, happily riddling any foes and friends alike who happen to be in the way full of arrows. In light of this, one can only be happy that the Avatar is the only one allowed to use magic; the thought of this lot of nincompoops armed with fireballs and magic missiles is downright terrifying. Theoretically, it’s possible to control combat to some degree by choosing from several abstract strategies for each character, and to directly intervene with the Avatar by clicking specific targets, but in practice none of it makes much difference. By the time some of your characters start deciding to throw down all their weapons and hide in a corner for no apparent reason, you just shrug and accept it; it’s as explicable as anything else here.

You’ll learn to dread your party’s constant mewling for food, not least because it forces you to engage with the dreadful inventory system. (No, they can’t feed themselves. You have to hand-feed each one of them like a little birdie.)

Thankfully, nothing else in the game is quite as bad as these two aspects, but there are other niggling annoyances. The need to manually feed your characters is prominent among them. There’s no interest or challenge to collecting food. Even if you aren’t willing to blatantly steal it from every building you visit — something for which, unlike in Ultima IV, there are no consequences — there are lots of infallible but tedious means of collecting money to buy it. (Determined to be a good Avatar, I spent literally hours when I played the game marching back and forth from one end of the town of Britain to the other, buying meat cheap and selling it expensive, all so as to buy yet more meat to feed my hungry lot.) The need for food serves only to extend the length of a game that doesn’t need to be extended, and to do it in the most boring way possible.

But then, this sort of thing had always been par for the course with any Ultima, a series that always tended to leaven its inspired elements with a solid helping of tedium. And then too, Ultima had always been a little wonky when it came to its mechanics; Richard Garriott ceded that ground to Wizardry back in the days of Ultima I, and never really tried to regain it. Still, it’s amazing how poorly Ultima VII, a game frequently praised as one of the best CRPGs ever made, does as a CRPG, at least as most people thought of the genre circa 1992. Because there’s no interest or pleasure in combat, there’s no thrill to leveling up or collecting new weapons and armor. You have little opportunity to shape your characters’ development in any way, and those sops to character management that are present, such as the food system, merely annoy. Dungeons — many or most of them optional — are scattered around, but they’re fairly small while still managing to be confusing; the free-scrolling movement makes them almost impossible to map accurately on paper, yet the game lacks an auto-map. If you see a CRPG as a game in the most traditional sense of the word — as an intricate system of rules to learn and to manipulate to your advantage — you’ll hate, hate, hate Ultima VII for its careless mechanics. One might say that it’s at its worst when it actively tries to be a CRPG, at its best when it’s content to be a sort of Britannian walking simulator.

And yet I don’t dislike the game as much as all of the above might imply. In fact, Ultima VII is my third favorite game to bear the Ultima name, behind only Martian Dreams and the first Ultima Underworld. The reason comes down to how compelling the aforementioned walking simulator actually manages to be.

I’ve never cared much one way or the other about Britannia as a setting, but darned if Ultima VII doesn’t shed a whole new light on the place. At its best, playing this game is… pleasant, a word not used much in regard to ludic aesthetics, but one that perhaps ought to crop up more frequently. The graphics are colorful, the music lovely, the company you keep more often than not charming. It’s disarmingly engaging just to wander around and talk to people.

Underneath the pleasantness, not so much undercutting it as giving it more texture, is a note of melancholy. This adventure in Britannia takes place many years after the Avatar's previous ones, and the old companions in adventure who make up his party are as enthusiastic as ever, but also a little grayer, a little more stooped. Meanwhile other old friends (and enemies) from the previous games are forever waiting in the wings for one last cameo. If a Britannia scoffer like me can feel a certain poignancy, it must be that much more pronounced for those who are more invested in the setting. Today, the valedictory feel to Ultima VII is that much more affecting because we know for sure that this is indeed the end of the line for the classic incarnation of Britannia. The single-player series wouldn't return there until Ultima IX, and that unloved game would alter the place's personality almost beyond recognition. Ah, well… it's hard to imagine a lovelier, more affectionate sendoff for old-school Britannia than the one it gets here.

The writing team loves to flirt with the fourth wall. Fortunately, they never quite take it to the point of undermining the rest of the fiction.

Yet even as the game pays loving tribute to the Britannia of yore, there’s an aesthetic sophistication about it that belies the series’s teenage-dungeonmaster roots. It starts with the box, which, apart from the title, is a foreboding solid black. The very simplicity screams major statement, like the Beatles’ White Album or Prince’s Black Album. Certainly it’s a long way from the heaving bosoms and fire-breathing dragons of the typical CRPG cover art.

When you start the game, you’re first greeted with a title screen that evokes the iconic opening sequence to Ultima IV, all bright spring colors and music that smacks of Vivaldi. But then, in the first of many toyings with the fourth wall, the scene dissolves into static, to be replaced by the figure of the Guardian speaking directly to you.

As you wander through Britannia in the game proper, the Guardian will continue to speak to you from time to time — the only voice acting in the game. His ominous interjections always seem to come when you least expect them.

The video snippet below of a play within the play, as it were, that you encounter early in the game illustrates some more of the depth and nuance of Ultima VII‘s writing. (Needless to say, this scene in particular owes much to Raymond Benson’s theatrical background.)

This sequence offers a rather extraordinary layer cake of meanings, making it the equal of a sophisticated stage or film production. We have the deliberately, banally bad play put on by the Fellowship actors, with its "moon, June, spoon" rhyme sequences. Yet peeking through the banality, making it feel sinister rather than just inept, is a hint of cult-like menace. Meanwhile the asides of our companions tell us not only that the writers know the play is bad, but that said companions are smart enough to recognize it as well. We have Iolo's witty near-breaking of the fourth wall with his comment about "visual effects." And then we have Spark's final verdict on the passion play, delivered as only a teenager can: "This is terrible!" (For some reason, that line makes me laugh every time.) No other game of 1992, with the possible exception only of the text adventure Shades of Gray, wove so many variegated threads of understanding into its writing. Nor is the scene above singular. The writing frequently displays the same wit and sophistication as what you see above. This is writing by and for adults.

The description of Ultima VII‘s writing as more adult than the norm also applies in the way in which the videogame industry typically uses that adjective. There’s a great extended riff on the old myths of unicorns and virgins. The conversation with a horny unicorn devolves into speculation about whether the Avatar himself is, shall we say, fit to ride the beast…

For all of the cutting-edge programming that went into the game, it really is the writing that does the bulk of the heavy lifting in Ultima VII. And it’s here that this early million-dollar computer game stands out most from the many big-budget productions that would follow it. Origin poured a huge percentage of that budget not into graphics or sound but into content in its purest form. If not the broadest world yet created for a computer at the time of the game’s release, this incarnation of Britannia must be the deepest and most varied. Nothing here is rote; every character has a personality, every character has something all her own to say. The sheer scale of the project which Raymond Benson’s team tackled — this game definitely has more words in it than any computer game before it — is well-nigh flabbergasting.

Further, the writers have more on their minds than escapist fantasy. They use the setting of Britannia to ponder the allure of religious cults, the social divide between rich and poor, and even the representation of women in fantasy art, along with tax policy, environmental issues, and racism. The game is never preachy about such matters, but seamlessly works its little nuggets for thought into the high-fantasy setting. Ultima VII may lack the overriding moral message that had defined its three predecessors, but that doesn’t mean it has nothing to say. Indeed, given the newfound nuance and depth of the writing, the series suddenly has more to say here than ever before.

Because of how much else there is to see and do, the main plot about the Guardian sometimes threatens to get forgotten entirely. But it’s enjoyable enough as such things go, even if its main purpose often does seem to be simply to give you a reason to wander around talking to people. In the second half of the game, the plot picks up steam, and there are a fair number of traditional CRPG-style quests to complete. (There are also more personal “quests” among the populaces of the towns you visit, but they’re largely optional and hardly earth-shattering. They are, however, often disarmingly sweet-natured: getting the shy lovelorn fellow together with the girl he worships from afar… that sort of thing.) The game as a whole is very soluble as long as you take notes when you’re given important information; there’s no trace of a quest log here.

While a vocal minority of Ultima fandom decries this seventh installment for the perfectly justifiable reasons I mentioned earlier in this article, the majority laud it as — forgive the inevitable pun! — the ultimate incarnation of what Richard Garriott began working toward in the late 1970s. Even with all of its annoying aspects, it’s undoubtedly the most accessible Ultima for the modern player, what with its fairly intuitive mouse-driven interface, its reasonably attractive graphics and sound, and its relatively straightforward and fair main quest. Meanwhile its nuanced writing and general aesthetic sophistication are unrivaled by any earlier game in the series. If it’s not the most historically important of the main-line Ultima games — that honor must still go to the thematically groundbreaking Ultima IV — it’s undoubtedly the one most likely to be enjoyed by a player today.

Indeed, it's been called the blueprint for many of the most popular epic CRPGs of today — games where you also spend much of your time just walking around and talking to a host of more or less interesting characters. That influence can easily be overstated, but that doesn't mean there isn't something to the claim. No other CRPG in 1992, or for some time thereafter, played quite like this one, and Ultima VII really does have at least as much in common with the CRPGs of today as it does with its contemporaries. On the whole, then, its hallowed modern reputation is well-earned.

Richard Garriott (far left) and the rest of the Ultima VII team toast the game’s release at Britannia Manor, the former’s Austin mansion.

Its reception in 1992, on the other hand, was far more mixed than that reputation might suggest. Questbusters magazine, deploying an unusually erudite literary comparison of the type of which Raymond Benson might have approved, called it “the Finnegans Wake of computer gaming — a flawed masterpiece,” referring to its lumpy mixture of the compelling and the tedious. Computer Gaming World‘s longtime adventure reviewer Scorpia had little good at all to say about it. Perhaps in response to her negativity, the same magazine ran a second, much more positive review from Charles Ardai in the next issue. Nevertheless, he began by summing up the sense of ennui that was starting to surround the whole series for many gamers: “Many who were delighted when Ultima VI was released can’t be bothered to boot up Ultima VII, as though it goes without saying that the seventh of anything can’t possibly be any good. The market suddenly seems saturated; weary gamers, sure that they have played enough Ultima to last a lifetime, eye the new Ultima with suspicion that it is just More Of The Same.” Even at the end of his own positive review, written with the self-stated goal of debunking that judgment, Ardai deployed a counter-intuitive closing sentiment: “After seven Ultimas, it might be time for Lord British to turn his sights elsewhere.”

Not helping the game's reception were all of the technical problems. It's all too easy to forget today just how expensive it was to be a computer gamer in the early 1990s, when the rapid advancement of technology meant that you had to buy a whole new computer every couple of years — or less! — just to be able to play the latest releases. More so even than its contemporaries, Ultima VII pushed the state of the art in hardware to its limit, meaning that anyone lagging even slightly behind the bleeding edge got to enjoy constant disk access, intermittent freezes of seconds at a time, and the occasional outright crash.

And then there were the bugs, which were colorful and plentiful. Chunks of the scenery seemed to randomly disappear — including the walls around the starting town of Trinsic, thus bypassing the manual-lookup scheme Origin had implemented for copy protection. A plot-critical murder scene in another town simply never appeared for some players. Even worse, a door in the very last dungeon refused to open for some; Origin resorted to asking those affected to send their save file on floppy disk to their offices, to be manually edited in order to correct the problem and sent back to them. But by far the most insidious bug — one from which even the current edition of the game on digital-download services may not be entirely free — was the keys that disappeared from players' inventories for no apparent reason. Given what a nightmare keeping track of keys was already, this felt like the perfect capstone to a tower of terribleness. (One can imagine the calls to Origin's customer support: "Now, did you take all of the stuff out of all of your packs and sort it out carefully on the ground to make sure your key is really missing? What about those weeks-old leeks down there at the bottom of your pack? Did you look under them?") Gamers had good cause to be annoyed at a product so obviously released before its time, especially in light of its astronomical $80 suggested retail price.

A Computer Gaming World readers’ poll published in the March 1993 issue — i.e., exactly one year after Ultima VII‘s release — saw it ranked as the respondents’ 30th favorite current game, not exactly a spectacular showing for such a major title. Wing Commander II, by way of comparison, was still in position six, Ultima Underworld — which was now outselling Ultima VII by a considerable margin — in a tie for third. It would be incorrect to call Ultima VII a flop, or to imply that it wasn’t thoroughly enjoyed by many of those who played it back in the day. But for Origin the facts remained when all was said and done that it had sold less well than either of the aforementioned two games after costing at least twice as much to make. These hard facts contributed to the feeling inside the company that, if it wasn’t time to follow Charles Ardai’s advice and let sleeping Ultimas lie for a while, it was time to change up the gameplay formula in a major way. After all, Ultima Underworld had done just that, and look how well that had worked out.

But that discussion, of course, belongs to history. In our own times, Ultima VII remains an inspiring if occasionally infuriating experience well worth having, even if you don’t normally play CRPGs or couldn’t care less about the lore of Britannia. I can only encourage all of you who haven’t played it before to remedy that while you wait for my next (and last) article about the game, which will look more closely at the Fellowship, a Britannian cult with an obvious Earthly analogue.

(Sources: the book Ultima: The Avatar Adventures by Rusel DeMaria and Caroline Spector; Origin Systems’s internal newsletter Point of Origin dated August 7 1991, October 25 1991, December 20 1991, February 14 1992, February 28 1992, March 13 1992, April 20 1992, and May 22 1992; Questbusters of July 1991 and August 1992; Computer Gaming World of April 1991, October 1991, August 1992, September 1992, and March 1993; Compute! of January 1992; online sources include The Ultima Codex interviews with Raymond Benson and Brian Martin, a vintage Usenet interview with Richard Garriott, and Sheri Graner Ray’s recollections of her time at Origin on her blog.

Ultima VII: The Black Gate is available for purchase on GOG.com. You may wish to play it using Exult instead of the original executable. The former is a free re-implementation of the Ultima VII engine which fixes some of its worst annoyances and is friendly with modern computers.)

Posted by Kate Sierzputowski

You've seen the perfect arcs of boiling water solidified mid-throw, and perhaps this frozen speeding sign that duplicated itself during 2019's Polar Vortex, but have you seen ghost apples? Thanks to a Facebook post by farm manager Andrew Sietsema, the phenomenon has captivated the internet, leaving commenters to marvel at the sight of these glass-like specimens that remain after apples have rotted from their icy exterior. Sietsema told CNN that this winter the weather in western Michigan was "just cold enough that the ice covering the apple hadn't melted yet, but it was warm enough that the apple inside turned to complete mush (apples have a lower freezing point than water)." Jonagolds are one of Sietsema's favorite apple varieties, but on the farm they are now referred to as "Jonaghosts." (via Reddit and Bored Panda)

Posted by Robin D. Laws

In the latest episode of our well-compassed podcast, Ken and I talk LARP TV, life before maps, word clusters and a disappearing airman.
posted by [syndicated profile] in_the_pipeline_feed at 02:11pm on 15/02/2019

Posted by Derek Lowe

I’ve written here about what I referred to as “nationalist science”, in that case actions by the Hungarian government against its own universities and the Chinese government’s vigorous promotion of traditional medicine. Now we can (unfortunately) add another one to the list. The Hindu nationalist movement in India has been moving into science and medicine in recent years, making claims about ancient discoveries and remedies that are completely unfounded but appeal to their supporters.

This article at Science will get you up to speed, most unenjoyably. There was an incident last month at the Indian Science Congress where a chemist, vice-chancellor of Andhra University yet, made the claim that ancient Hindus had been doing research in stem cell technology, based on a tale from the Mahabharata. You know, back in 1972 I was more skeptical as a ten-year-old reading those Erich von Däniken paperbacks which made similar claims, so it's not very encouraging to see this stuff showing up in 2019. In fact, from the looks of it, some of these folks are citing the exact same verses in the ancient epics, and why the hell not, I guess.

Problem is, this is not some lone crank:

Some blame the rapid rise at least in part on Vijnana Bharati (VIBHA), the science wing of Rashtriya Swayamsewak Sangh (RSS), a massive conservative movement that aims to turn India into a Hindu nation and is the ideological parent of Modi’s Bharatiya Janata Party. VIBHA aims to educate the masses about science and technology and harness research to stimulate India’s development, but it also promotes “Swadeshi” (indigenous) science and tries to connect modern science to traditional knowledge and Hindu spirituality.

VIBHA receives generous government funding and is active in 23 of India’s 29 states, organizing huge science fairs and other events; it has 20,000 so-called “team members” to spread its ideas and 100,000 volunteers—including many in the highest echelons of Indian science.

The former head of Indian defense research, for example, says that he firmly believes in the powers of gemstones to influence human health. Narendra Modi himself claimed a few years ago that the transplantation of the god Ganesh's elephant head onto a human was an example of outstanding ancient Hindu surgical techniques. And if that sort of thing doesn't make you want to bury your head in your hands, try this:

Critics say pseudoscience is creeping into science funding and education. In 2017, Vardhan decided to fund research at the prestigious Indian Institute of Technology here to validate claims that panchagavya, a concoction that includes cow urine and dung, is a remedy for a wide array of ailments—a notion many scientists dismiss. And in January 2018, higher education minister Satya Pal Singh dismissed Charles Darwin’s evolution theory and threatened to remove it from school and college curricula. “Nobody, including our ancestors, in written or oral [texts], has said that they ever saw an ape turning into a human being,” Singh said.

Excellent. The first time I remember hearing that one was from Mr. Smith, an elderly man who lived next door to us in my small Arkansas town in the late 1960s. He had the exact same line about apes and humans, and went on to inform me that the moon landing program was a hoax and that dinosaurs never existed ("Just a bunch of old bones they stuck together"). As a six-year-old fan of NASA and defender of the honor of dinosaurs, these claims did not go over well with me. My 1968 visions of what the world would be like in fifty years tended towards space travel and flying cars, and most definitely did not include national ministers of science taking the side of Mr. Smith.

Needless to say, India has produced great scientists (Hindu and otherwise) who have done great work: Bose, Raman, Chandrasekhar, Ramanujan, Khorana and many more. But the country’s scientific record is dishonored and mocked by this sort of thing. There are many prominent Indian researchers speaking out against these idiotic statements, and I support them wholeheartedly. Science in general is dishonored by attempting to impose nationalist or religious criteria on top of its underlying principles. Those principles? To find out the truth about the natural world, to validate it by careful and repeated experiment, to build on that knowledge wherever it may lead. To understand physical reality, in other words, to work with it as it is and not to play games by believing only what it makes us feel good to believe.

rydra_wong: The BBC's error 500 page, showing the test card clown surrounded by flames. (error fire clown)
posted by [personal profile] rydra_wong at 11:33am on 15/02/2019 under
Led By Donkeys are releasing all their artwork for people to do with what they will -- put it on t-shirts, do some guerilla postering, make beer mats and furtively leave them in Wetherspoons -- so if you'd like to help publicize these notable Tweets and public statements by Brexiteers which they'd rather people weren't reminded of, you can:


They've now put up over 150 billboards and have a new stretch goal so people can keep giving them money:

rydra_wong: Lee Miller photo showing two women wearing metal fire masks in England during WWII. (Default)
posted by [personal profile] rydra_wong at 10:47am on 15/02/2019
So that icon meme, where I was trying to explain my default icon --

So it's a staged photo, probably using professional fashion models but in the setting of a real air raid shelter and real fire masks (and everyone in London was living in the context of the Blitz at that point), and I think it derives some of its zing from that. Whether you know the women are models or not, they're stylish, casual, a little jaunty; the woman on the right is dangling the whistle thoughtfully from one manicured hand. With those uncanny masks obscuring their faces. I love it so much.

posted by [syndicated profile] tilesorstuds_feed at 12:14am on 15/02/2019

Posted by Kaplan

This awesome-looking, movie-accurate UCS-style Speeder MOC was shared by flickr.com user and LEGO fan Aniomylone, who is known for his great UCS-style Star Wars MOCs like the UCS Hailfire Droid, UCS Sith Infiltrator, and many more. Rey's Speeder from Star Wars: Episode VII - The Force Awakens is a very detailed and movie-accurate model with perfect angles and color scheme. I really like the inclusion of metal scraps in the net on the side of the vehicle. The data sheet also looks very cool and resembles the ones that come with official sets.
siderea: (Default)
posted by [personal profile] siderea at 03:53am on 15/02/2019 under
Imagine being one of the last hundred speakers of your language in all the world, and, having been asked by one of your elders to learn the songs of your people, deciding to try to save them by making an album of them so beautiful that all the world would hear it, and come to treasure them.

Via Metafilter: Jeremy Dutcher's album "Wolastoqiyik Lintuwakonawa" [Youtube playlist]. A shockingly beautiful and heartbreakingly heroic work of cultural preservation and propagation. From Metafilter:
Jeremy Dutcher is a First Nations classically trained tenor, musician, and composer whose debut album Wolastoqiyik Lintuwakonawa, sung entirely in the "severely endangered" language of Wolastoqey, won the 2018 Polaris Prize, which is awarded annually to the best full-length Canadian music album. [...] Dutcher is a Wolastoqiyik (Maliseet) member of the Tobique First Nation, and his album is based on traditional Wolastoqiyik songs, often sampling century-old wax cylinder recordings of his ancestors' singing, to devastatingly beautiful effect.
Glorious, glorious. Highly recommended.

Posted by Kaplan

As a fan of the LEGO Brickheadz line I really appreciate high-quality custom Brickheadz MOCs. Flickr user and LEGO fan Michael Jasper shared this awesome Lady Liberty MOC in his photostream. The attention to detail is really great, and I especially like the perfect usage of two different grill pieces. The addition of the 2x1 printed tile piece that comes with the Lady Liberty minifigure from Collectible Minifigure Series 6 is also a good touch.

Posted by Jonathan Jonas

This is the first in a series of articles reviewing several products to add lighting to your LEGO Models. In the coming weeks, we will summarize what we’ve learned in a LEGO Lighting Guide (similar to our popular LEGO Storage Guide.)

It's a lot of fun to build a LEGO set or a custom LEGO creation. Once it's perfect, you might want to show it off in your home, display it at a LEGO convention, or share photos of your amazing creation online. Lighting is one of the best ways to draw attention to your creation and make it stand out.

I realized that I needed to add some lighting when displaying some of my creations at a Christmas event for a local children's hospital charity where our LUG shows off a Town and Train display. I wanted to try adding inexpensive LED Christmas light strings to a few of my Modulars, so I bought a selection of battery-operated LED light strings in after-Christmas sales. Of the strings I purchased, some used ribbon wire and others used normal wires to run power to the lights.

Three different generic battery-powered LED Christmas Lights.

The problem with LED Christmas Lights is that they are not designed for use with LEGO. As you can see, each string has its own battery pack, so if you want to use different types of lighting, or incorporate multiple strings of lights, you need to fit multiple battery boxes inside your LEGO model.

While most products do not offer any lighting effects, some allow the entire strand to flash (but who needs an entire string of LEDs flashing in their model?) Some of these strings let you cut the wires to shorten them, but that's about as far as you can go to modify these strings without breaking out a soldering iron. (While I took an electronics class in high school, I don't really want to cannibalize a string of LEDs.)

First floor of Parisian Cafe, with ribbon lighting tucked into a groove attached to the ceiling.

If you are planning on buying LED Christmas lights, bring some batteries and a Technic piece with you—not all strings fit through a Technic hole. You also need to consider how easily you can bend the wires to fit within your model. Ribbon wiring is nice and thin, so you might be able to fit it between bricks. The ribbon bends nicely in one direction, but it won't bend at all in the other direction. Traditional wiring can be bent in both directions, but is thicker.

With the limitations of your lighting product in mind, you have to figure out how to run the wires through your set, and where to put the battery pack. Even though the lights are inexpensive and small, the wiring isn't as thin as that of LEGO-specific light kits from 3rd-party sellers.

Lighting for #10243 Parisian Cafe

In the following example, you can see that I've added a groove near the ceiling containing the lights (also called cove lighting) and built out the fireplace so I can sneak the ribbon up. From there, I placed an LED behind the fireplace and continued to run the ribbon up for more cove lighting on the 2nd floor. Since the roof opens up on the 3rd floor for playability, I created a vertical light bar for that level.

Parisian Cafe with Cove Lighting and Vertical Light Bar.

Another big consideration for me is playability. Running a single strand of lights all the way through the model makes it harder to open the model back up again, which I didn't like, since I want to be able to open it up. I went through the model redesigning the floors, ceilings, and walls to sneak that cable through while still allowing the model to come apart. The third floor in this example comes off completely so you can see the 2nd floor, but the 2nd-floor ceiling is now tied up with lighting; while I can get the 1st and 2nd floors apart, they can't be completely separated because there is limited distance for the ribbon to shift around.

Parisian Cafe, Front and Rear with Lighting.

Overall, I was pretty happy with the lighting, but you’ll notice the street lamp and the hanging lights along the front of the building are not lit. Running lighting to them is impractical with this method of lighting your model, which is where a pre-made kit might be worth the extra money. Some kits come with lights for those and features like a flickering fireplace light. While this is definitely an economical option, the biggest downsides were the ugly battery box, and some restrictions on opening the modular up afterwards.

Lighting for #10260 Downtown Diner

Most of these lighting products fit into a Technic hole, with about one stud's worth of space needed for the wiring and the back of the LED bulb. With some skill, you can fit the light into the hole, run the cables back and forth, and put a cover on the top to keep it all in place. This actually works fairly well, but the lighting cove is 5 plates high, which really starts to eat into the interior space.

Downtown Diner, Cove lighting in the 3rd floor recording studio.

You can really see how much space is taken up in the photo of the recording studio (on the 3rd floor of the Downtown Diner.) Similarly, on the first floor of the diner the cove interfered with the framed records on the walls, so I ended up moving the pictures down so that the lighting run wasn't passing over the top of them.

To fully light the Diner, I ended up using 2 strands, which meant 2 battery packs. Since this was a '50s-style diner, I decided to build out a smokehouse in the back and run a chimney up to the roof. This gave me space to hide the batteries, a way to shuttle the wires up from floor to floor, and access to the back of the jukebox so I could add four LEDs to that. A small side door and a regular door give access to both on/off switches.

Downtown Diner with Lights (left), and back of Diner showing smokehouse built to cover the battery packs (right.)

The first string is a multi-colored strand which I ran to the diner to give it that sort of fun vibe. In a darker room, the colors give a nice ambiance to the diner. The other floors have a standard white light string. Unfortunately, some of the neat things I wanted lit up, like the big 2-story window, are very hard to work into the design without completely redesigning the building. While I didn't mind adding an extension on the back to hide the batteries on the Diner or expanding the fireplace on the Cafe, I didn't really want to totally redesign that beautiful large picture window. Here again is where the kits are far superior to the off-the-shelf Christmas lights.


As you can see, the lights are so dim that I could barely take a photo in a fairly dark room. Even with fresh batteries, these lights mostly disappear in a well-lit space.

Battery-powered LED Christmas lights are not very bright.

In the end, I was happy with the lighting, but I bought all three strings of lights on clearance for less than 6 dollars. They pale in comparison to lighting solutions designed for LEGO! Battery-powered LED Christmas lights earn our "Acceptable" (2/5 star) rating—only the very low cost saved them from our lowest rating!

In future reviews, we will look at products designed specifically for adding lighting to your LEGO models.

The post Review: Battery-powered LED Christmas Lights appeared first on BRICK ARCHITECT.

Posted by Mary Ann O'Donnell

Why Singleton Lunch? Why invite someone to Handshake 302, have them prepare a meal, share it with a group of friends and strangers, and call it “art”? What’s the difference between a Singleton Lunch meal and more traditional forms of art like painting or theater or even a happening?

Simply put, the difference lies in the goals of the artwork. Traditional art aims to produce different responses in its audience. Painting, for example, aims to produce an appreciation of beauty (or its opposite), while drama brings its audience through an emotional experience in order to share something about the human experience. During a happening, the artist brings our attention to the human body as a vehicle for aesthetic expression. In contrast, art events like Singleton Lunch belong to a category of art called “relational aesthetics.”

Curator Nicolas Bourriaud introduced the term "relational aesthetics" to describe an art trend that he found more prevalent in the digital age. The 1990s saw the rise of the internet and increasing isolation of people from one another. Before the rise of digital technology, most people completed everyday tasks through a series of human interactions. We shopped at local markets and talked with the shopkeeper, while at work we interacted with colleagues during breaks and meals in the canteen. However, with the rise of the internet, many of us now shop online, work from home, and play with virtual friends in virtual worlds. In other words, relational aesthetics is a response to our increasing isolation from each other.

Importantly, relational aesthetics calls attention to lived loneliness, rather than actual independence. In our everyday lives, we continue to depend on other people. Farmers produce our food, workers assemble our furniture and cell phones, and urban planners design the complex cities in which we live. However, the internet and digital technology have increasingly mediated these relationships, creating the mistaken idea that we are "alone" despite being surrounded by millions of other urban residents.

In Shenzhen, this feeling of being alone in a crowd is common. Who has not heard a relative or friend complain that our city "lacks human feeling"? In part, these feelings of loneliness are part of industrial urbanization. As early as 1893, French sociologist Émile Durkheim had already introduced the term "anomie" to describe the lack of moral guidance and sense of social belonging that many rural migrants experienced upon leaving their homes to find work in urban factories. Durkheim emphasized that being cut off from family, friends, and familiar contexts often led to antisocial feelings of anxiety, anger, and despair, which in turn led to social unrest, crime, and often suicide.

At this basic level, the sense of loneliness that many experience after migrating to Shenzhen has been a normal (if unwelcome) aspect of urbanization since the late 19th century. At the same time, the rise of the internet has intensified this aspect of modern urban life. In modern cities today, migrants are not only cut off from hometown family and friends, but also from neighbors, shopkeepers, and even co-workers. Just take a look at commuters on the metro. Sometimes, more than twenty strangers can be packed together, arms and legs touching, but no one is talking to each other. Instead, we are looking at our cellphones. In the best case, we are simply ignoring the people in our metro car. In the worst case, we resent them for crowding us, coughing, or even breathing too loudly!

Relational art like Singleton Lunch attempts to ameliorate this situation through artwork that creates relationships. The completion of the artwork not only relies on the participation of different people, but also succeeds to the extent that participants are willing to break down the habitual barriers between people. Of course, Claire Bishop has criticized the kinds of relationships that this art produces, arguing that relational artworks connect people with similar backgrounds. Indeed, she suggests that relational artwork confirms our own biases as we end up talking with people who share our interests and social views, rather than exposing us to the lived heterogeneity of every city on the planet.

All this to say, life in a migrant city like Shenzhen presents its residents with interrelated but different conundrums.

On the one hand, as human beings we need relationships in order to thrive. It is not enough for one person to come to Shenzhen and simply work, go home, play on a cellphone, go to bed, and then wake up the next morning for more of the same. This kind of routine is antithetical to our natures; we yearn for connection and laughter and the security of belonging. Arguably, the popularity of pets in Shenzhen shows that we are making relationships with members of other species in order to compensate for the relative superficiality of our human relationships. In this sense, relational art that brings like-minded people together helps ease the sense of isolation that many of us feel, helping the city as people (rather than as buildings) to take root and grow.

On the other hand, the complex heterogeneity in a global megacity forces us to figure out how to address and navigate all this diversity. In Shenzhen, society comprises millions of people from different eras, different hometowns, different classes, and different professions. How can small-scale interventions like Singleton Lunch help ameliorate the vast distances that seem to separate us, especially when we are seated next to each other and dare not say hello?

Perhaps it is necessary to acknowledge that each of us can only offer small-scale interventions. Over the course of a busy day, how many people can we genuinely get to know? One? Or does it take years of a life, carefully cultivating different relationships and accepting that although human beings need human relationships to flourish, we cannot force ourselves to make friends easily? A three-hour lunch with several strangers and several friends may be the right scale for feeling comfortable and taking a chance on learning about someone else's experience.

We thank everyone who willingly opened their heart to come and listen to different stories. Over two and a half months, we met people from Korea and Portugal, Sichuan and Taiwan. We talked about urban villages and living with parents after college. We ate many delicious foods, although truth be told that pot of seafood congee was the most joyful surprise of the series. And we realized that even if small-scale events like Singleton Lunch cannot transform the social problems created by immigration and urbanization, these activities do offer a means of finding companions in figuring out a way forward.

siderea: (Default)
posted by [personal profile] siderea at 12:53am on 15/02/2019
So I'm packing (and washing, but not in that order) my CDs (or rather their jewel cases) and discovered a whole little cache of CDs I had little to no recollection of, which aren't ripped to my computer. And then I remembered that my laptop's CD/DVD drive was out of order for a while, and I wound up with a backlog of media that I had either purchased or been gifted that I had no way to play. By the time I got the device straightened out, I had forgotten I had them.

It's like Christmas!

I'm not ripping them all now, but maybe I can take some time on unpacking to get it done.

Tangentially related: I have been saving almost every box from every delivery for about a year, in anticipation of moving. Turns out the box that my nifty new boots were shipped in is almost exactly the size of my remaining CD collection. (Of course, now that they're all packed, I'm now going to find another little cache of CDs somewhere in my apartment. Let me enjoy my illusions for the moment.)
yhlee: red and black tentacle heart pendant (tentacle heart)
posted by [personal profile] yhlee at 11:19pm on 14/02/2019
I will have to peruse the rest of the archive at a later point as I'm flying out to Dallas-Ft. Worth at ass o' clock tomorrow morning, but I received two lovely, completely G-rated gifts for [community profile] chocolateboxcomm!

The Purpose of Witch's Cats (0 words) by Anonymous
Chapters: 1/1
Fandom: Star Trek: Discovery
Rating: General Audiences
Warnings: No Archive Warnings Apply
Relationships: Michael Burnham & Philippa Georgiou
Characters: Michael Burnham, Philippa Georgiou, Original Cat Character - Character
Additional Tags: Halloween Costumes, Fanart, Drawing, Pre-Canon

Michael Burnham is spending Halloween with Philippa Georgiou. As they hand out candy to trick-or-treaters, Michael is trying to figure out the purpose of dressing up. So far she's only sold on cats.

This is chibi and pure adorable. The huge eyes! Georgiou with her jack o' lantern bag o' candy! Michael and the black cat!

Sketchbook #37 (0 words) by Anonymous
Chapters: 1/1
Fandom: Vorkosigan Saga - Lois McMaster Bujold
Rating: General Audiences
Warnings: No Archive Warnings Apply
Relationships: Aral Vorkosigan/Ges Vorrutyer
Characters: Ges Vorrutyer
Additional Tags: Sketches

Aral tries to draw Ges.

A great pencil sketch of Aral (well, his hand) drawing Ges Vorrutyer! Lovely delineation of form and a loose, relaxed style.
Mood: eee!

Posted by cks

Recently, I tweeted:

I probably shouldn't be surprised that a Thunderbolt 10G-T Ethernet adapter can do real bidirectional 10G on my Fedora laptop (a Dell XPS 13), but I'm still pleased.

(I am still sort of living in the USB 2 'if it plugs in, it's guaranteed to be slow' era.)

There are two parts to my pleasant surprise here. The first part is simply that a Thunderbolt 3 device really did work fast, as advertised, because I'm quite used to nominally high-speed external connection standards that do not deliver their rated speeds in practice for whatever reason (sometimes including that the makers of external devices cannot be bothered to engineer them to run at full speed). Having a Thunderbolt 3 device actually work feels novel, especially when I know that Thunderbolt 3 basically extends some PCIe lanes out over a cable.

(I know intellectually that PCIe can be extended off the motherboard and outside the machine, but it still feels like magic to actually see it in action.)

The second part of the surprise is that my garden variety vintage 2017 Dell XPS 13 laptop could actually drive 10G-T Ethernet at essentially full speed, and in both directions at once. I'm sure that some of this is in the Thunderbolt 3 10G-T adapter, but still; I'm not used to thinking of garden variety laptops as being that capable. It's certainly more than I was hoping for and means that the adapter is more useful than we expected for our purposes.

This experience has also sparked some thoughts about Thunderbolt 3 on desktops, because plugging this in to my laptop was a lot more pleasant an experience than opening up a desktop case to put a card in, which is what I'm going to need to do on my work desktop if I need to test a 10G thing with it someday. Unfortunately it's not clear to me if there even are general purpose PC Thunderbolt 3 PCIe cards today (ones that will go in any PCIe x4 slot on any motherboard), and if there are, it looks like they're moderately expensive. Perhaps in four or five years, my next desktop will have a Thunderbolt 3 port or two on the motherboard.

(We don't have enough 10G cards and they aren't cheap enough that I can leave one permanently in my desktop.)

PS: My home machine can apparently use some specific add-on Thunderbolt 3 cards, such as this Asus one, but my work desktop is an AMD Ryzen based machine and they seem out of luck right now. Even the addon cards are not inexpensive.

posted by [syndicated profile] dg_weblog_feed at 03:22am on 15/02/2019

Posted by diamond geezer

Dear DG

Thank you for submitting your article "The Edible Bus Route".

Unfortunately we will not be able to use it on our platform as it is insufficiently on brand.

When we commissioned this article we assumed the title referred to bars and restaurants, and that its key content would include all the best eateries along the way. Instead, if we've got this right, you tell us that The Edible Bus Route references a handful of flower beds installed by community activists. Sorry, but this is interesting how?

The 322 bus route links many fantastic foodie destinations, including Clapham, Brixton and Crystal Palace. Many of our readers will have favourite pasta boltholes or tequila speakeasies in these locations. You appear to have ignored all of these in your write-up, instead focusing on locations that do not serve any food whatsoever.

You claim that Landor Road in Clapham is the site of London's first Edible Bus Stop. We checked and apparently it has been there since 2011, so although it was indeed pioneering it is alas old news. How clever of Mak and Catherine to have come up with the idea, and what a transformation, but there are far more Insta-friendly spots than this to see a bunch of crocuses.

Your photographs are poor. We understand you were unable to stand in the optimum location because the benches were occupied by local people swilling alcohol and energy drinks, but it is hard to enthuse over the surrounding raised beds at this scale. Also the whole point of The Edible Bus Route is that the 322 stops here, but you did not wait long enough to get a vehicle in shot.

Our online audience do not ride buses, so need a much better reason to grab an Uber to SW9 and take a look for themselves. The artisan croissants from the Old Post Office across the road look amazing, as one would expect from London’s Oldest Organic Bakery, but you have overlooked their wholesome rye sourdough in favour of a few herb boxes.

The second pertinent location on The Edible Bus Route appears to be almost three miles away, as the 322 travels. This is not the hit rate we expect from a genuine horticultural phenomenon. Sorry, where is this place you call Tulse Hill - have you made the name up? We are not aware of any other sightseeing locations in this distant suburb, and a few piles of soil have not changed our minds.

Again you are trying to pass off a 2012 project as cutting edge, and we fear that our competitor platforms would have featured it at the time (had they been interested). You claim that The Hoopla Garden is a must-see because of its bollards, but we understand that most of these were present previously so merely incorporated amid the planting. Your supposed arty photograph of one of the bollards has cropped off the final two letters and cannot be used.

Why on earth did you visit the site in February? No urban orchard will be abundant with fruit bushes and nut trees in winter, and the garden will not be a "haven for pollinators", as you put it, for a few more months. The edible aspect of your reportage is sorely lacking throughout, and you have focused too strongly on benches and the occasional daffodil.

Which brings us to the so-called Edible Bus Station. Of all the exciting things there are to do in Crystal Palace, why have you drawn this to our attention? The 322 passes the Dreamcatcher Imagination Hub and the Craft & Courage gin dispensary, not to mention Tamnag Thai, but you have chosen instead to write at length about four barely-visible overgrown triangular plots behind some railings. They do not compare.

If you are considering a rewrite, we suggest a post-Brexit angle. What would a truly Edible Bus Route look like? Might communities pull together to make a success of sudden food shortage? Is there scope to plough up roadside verges to grow vegetables and feed the nation? How much sustainable jam could urban blackberries provide?

As things stand, however, your description of three small cultivated spaces along a six mile bus route lacks any kind of engagement. The only Edible Bus Route we want to read about starts with brunch at Minnow, stops off for gentrified cuisine in Brixton's covered market and finishes off with a gelato from Four Hundred Rabbits.

Come back to us when you've learned to prioritise commercial opportunity over well-meaning sustainability.

With All Kind Wishes
yhlee: icosahedron (d20) (d20 (credit: bag_fu on LJ))
posted by [personal profile] yhlee at 06:47pm on 14/02/2019
When I go on trips is when I usually get reading done. Yay Kindle?

recently or not so recently finished
Michael Cooper. Help! My Facebook Ads Suck!
This was recommended by [personal profile] helen_keeble. I have never run a FB ad and never intend to since I'm not really in self-pub [1], but I have friends who are in that business and I was curious about the terminology and methodologies involved. This is clearly written and really interesting, and I've heard people vouch for it, although I can't vouch for it myself from experience.

[1] I have a self-published collection of flash fairy tales but it just sort of sits there on Amazon and maaaaaaybe once in a while someone buys a copy. If you're going to self-pub flash fairy tales, go Patreon, not Amazon. :p

- JoAnneh Nagler. How to Be an Artist Without Losing Your Mind, Your Shirt, or Your Creative Compass.
I read this a while back and forgot to report on it (one of the problems with my Kindle is I keep losing it around the house). The gist of this is "don't quit your day job before you're bringing in enough income with your creative job." It's pretty pragmatic and I generally agree with it. There's really not much else to say about it.

in progress

- Anne McCaffrey. To Ride Pegasus.
This is a nostalgia reread for me, and this is one of those fix-up novels made of short stories in a sequence as far as I can tell--I think this and its sequels are precursors to The Rowan and Damia, etc. I really enjoy reading about the early era of psionics in this setting, although I have to *facepalm* some at parts of "A Womanly Talent." spoiler )

- Tim Harford. The Undercover Economist Strikes Back.
It would probably have made more sense to read the first in the series (?) first, but this one was on sale and the others weren't, so I picked it up. I'm about a third of the way through and really enjoying it. This volume is on macroeconomics and has given me the first explanation that made any sense as to why money works. (I tend to get stuck on the fact that money is a mass delusion and stutter to a halt.) Of course, this is me, so econ explanations don't stick in my head, but now I know where I can look.

- Penelope Bloom. His Banana.
This self-published romance has a killer blurb (formatting aside) but I'm only about 10% in and not sure yet whether the book itself is my kind of thing.

My new boss likes rules, but there's one nobody dares to break...
No touching his banana.
Seriously. The guy is like a potassium addict.
Of course, I touched it.
If you want to get technical, I actually put it in my mouth.
I chewed it up, too... I even swallowed.
I know. Bad, bad, girl.
Then I saw him, and believe it or not, choking on a guy's banana does not make the best first impression. [etc.]

- Paul Bloom. Against Empathy.
Bloom appears to have some kind of argument against emotional empathy (where you feel what you think someone else is feeling) as opposed to cognitive empathy (the cognitive ability to anticipate/predict others' emotional states). As an example of the kind of argument he makes, he points out that because empathy (emotional empathy) is innumerate, people will make knee-jerk judgments based on a single shocking case where statistically the other decision would be of benefit to more people. Color me extremely curious--I'm doubly curious because I have weak to nonexistent emotional empathy when dealing with people face to face. (I can't even figure out why you would want it. When Joe is really upset about something that's gone wrong, my getting upset as well is rarely if ever going to help me do something rationally useful about the situation? On the other hand, I know cognitively he's upset and wants X done for reassurance, so I can make whatever soothing noises are required, or whatever.) ANYWAY. I'm kind of skeptical but willing to read the book to find out. (Could have been useful Kujen research if I'd found the book earlier, ahahahaha.)
Mood: busy
posted by [syndicated profile] dg_weblog_feed at 12:01am on 15/02/2019

Posted by diamond geezer

The Count 2019 - half-time update (with approximate change since 2018)

Count 1) Number of visits to this blog: much the same
Count 2) Number of comments on this blog: down
Count 3) Number of words I wrote on this blog: up
Count 4) Number of hours I sleep: slightly up
Count 5) Number of nights I go out and am vaguely sociable: down
Count 6) Number of bottles of Becks I drink: nil
Count 7) Number of cups of tea I drink: same
Count 8) Number of trains I travel on: up
Count 9) Number of steps I walk: well up
Count 10) The Mystery Count: nil
February 14th, 2019

Posted by John Scalzi

And on Valentine’s Day, too! Awwwwwww.

I’m out because I feel like it but also because I have a project to finish. So, unplugging from the Internet to get done. As one sometimes has to do. See you all next week.

posted by [syndicated profile] johndcook_feed at 10:09pm on 14/02/2019

Posted by John

Yesterday I mentioned μRNG, a true random number generator (TRNG) that takes physical sources of randomness as input. These sources are independent but non-uniform. This post will present the entropy extractor μRNG uses to take non-uniform bits as input and produce uniform bits as output.

We will present Python code for playing with the entropy extractor. (μRNG is extremely efficient, but the Python code here is not; it’s just for illustration.) The code will show how to use the pyfinite library to do arithmetic over a finite field.

Entropy extractor

The μRNG generator starts with three bit streams—X, Y, and Z—each with at least 1/3 bit min-entropy per bit.

Min-entropy is Rényi entropy with α = ∞. For a Bernoulli random variable that takes on two values, one with probability p and the other with probability 1-p, the min-entropy is

-log2 max(p, 1-p).

So requiring min-entropy of at least 1/3 means max(p, 1-p) must be less than 2^(-1/3) ≈ 0.7937.
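As a quick sanity check, the min-entropy of a Bernoulli variable is easy to compute directly (a small illustrative snippet, not part of μRNG itself):

    from math import log2

    def min_entropy(p):
        """Min-entropy, in bits, of a Bernoulli(p) random variable."""
        return -log2(max(p, 1 - p))

    print(min_entropy(0.79))  # about 0.34 bits, just over the 1/3 requirement
    print(min_entropy(0.5))   # a fair coin has the maximum, 1 bit
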

Take eight bits (one byte) at a time from X, Y, and Z, and interpret each byte as an element of the finite field with 2^8 elements. Then compute

XY + Z

in this field. The resulting stream of bits will be independent and uniformly distributed, or very nearly so.
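For readers who want to see the field arithmetic spelled out, here is a self-contained sketch of multiplication in GF(2^8) using shift-and-reduce. The reduction polynomial below is the AES polynomial, chosen purely for illustration; μRNG's actual modulus may differ.

    def gf256_mul(a, b, poly=0x11B):
        """Multiply two bytes in GF(2^8), reducing by poly.

        poly=0x11B is x^8 + x^4 + x^3 + x + 1 (the AES polynomial);
        an illustrative choice, not necessarily the one muRNG uses.
        """
        result = 0
        while b:
            if b & 1:          # if the low bit of b is set, add (XOR in) a
                result ^= a
            a <<= 1            # multiply a by x
            if a & 0x100:      # reduce modulo the field polynomial
                a ^= poly
            b >>= 1
        return result

    def extract_byte(x, y, z):
        """One step of the extractor: XY + Z in GF(2^8) (addition is XOR)."""
        return gf256_mul(x, y) ^ z

A handy check that the reduction is working: in the AES field, 0x53 · 0xCA = 0x01 (they are multiplicative inverses).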

Python implementation

We will need the bernoulli class from SciPy for generating our input bit streams, and the pyfinite library for doing finite field arithmetic on the bytes.

    from scipy.stats import bernoulli
    from pyfinite import ffield

And we will need a couple bit manipulation functions.

    def bits_to_num(a):
        "Convert an array of bits to an integer."
        x = 0
        for i in range(len(a)):
            x += a[i]*2**i
        return x

    def bitCount(n):
        "Count how many bits are set to 1."
        count = 0
        while n:
            n &= n - 1  # clear the lowest set bit
            count += 1
        return count

The following function generates random bytes using the entropy extractor. The input bit streams have p = 0.79, corresponding to min-entropy 0.34.

    def generate_byte():
        "Generate bytes using the entropy extractor."
        b = bernoulli(0.79)
        x = bits_to_num(b.rvs(8))
        y = bits_to_num(b.rvs(8))
        z = bits_to_num(b.rvs(8)) 

        F = ffield.FField(8)
        return F.Add(F.Multiply(x, y), z)

Note that 79% of the bits produced by the Bernoulli generator will be 1’s. But we can see that the output bytes are about half 1’s and half 0’s.

    s = 0
    N = 1000
    for _ in range(N):
        s += bitCount( generate_byte() )
    print( s/(8*N) )

This returned 0.50375 the first time I ran it and 0.49925 the second time.

For more details see the μRNG paper.
