IBM Model M – My New Old Toy

Sometimes all you need is a new toy. My new toy’s a really old toy, but it has the same effect as something shiny and new. Such a thing of beauty, such a satisfying experience. It’s funny, because in its heyday this was a run-of-the-mill accessory for IBM computers, which is about as boring and mundane as you can get.

Serial number and manufacturing date


This particular unit was manufactured in 1987. Ever since I heard about mechanical keyboards in 2012-13, I have wanted to have one of these in my possession. I’m glad to report that it lives up to all that hype.

The anticipation combined with the honeymoon period of using something new. The slight hiccup while setting it up. Browsing forums to find fixes for issues. Finally getting a thirty-year-old piece of electronics to work with a modern computer. It scratches my itch for tinkering with things and making them work together. An itch I have had since the days before digitally downloadable games were the norm, and I had to buy bargain-basement games due to a lack of availability where I lived. My computer was barely able to run those games, and I had to find all sorts of fixes for a multitude of issues. The satisfaction of getting a game to run for the first time, after hours and sometimes even days of troubleshooting, is what the experience of getting this keyboard to work reminded me of.

How it feels to type on it

Now that the beige behemoth was up and running, I started to type. This anticipation to type is something I hadn’t felt in ages. It didn’t matter if I was a bad writer. It didn’t matter if a genius-level intellect told me I wasn’t good at it. None of that mattered. I wanted to do this again. I wanted to tear open a hole in the dark unconscious and pour the contents of the mind into the realm of the digital, through this ancient instrument. This instrument that flawlessly captures what it should feel like to put words on a screen. Just the right amount of force. Just the right level of click. Throw in the occasional ping. Hear those sounds, let your thoughts achieve a particular rhythm. Lose your conscious thoughts and inhibitions about your apparent skill or the lack of it. All of that is pointless. You write because you don’t want this to stop. You don’t want the clicking to stop. You want to keep typing through the slight fatigue, till your fingers start to feel the strain, your fingertips slightly hot from pushing down on the perfectly curved and textured plastic keys.

In a casual conversation with friends, I joked that this was my weapon against the transgressions of my upstairs neighbor who likes cranking up the volume when he watches TV and listens to music. It doesn’t matter to me now. I want to write. Write about something. Write about anything. My will to write was like a rock precariously perched on a cliff waiting for a little nudge to convert potential energy to kinetic. The nudge came in the form of a new keyboard. The rock is certainly rolling now. The content does not matter. This is purely self-indulgent writing. Writing for its own sake.

So familiar, yet so strange

This brings back memories of a time that I now feel was much simpler. The feeling is familiar, but the memory is distant. In some way, this keyboard feels just like that- an object from the distant past that feels so familiar as I use it today. I didn’t need to learn anything, I didn’t have to readjust. Everything was in the right place. Like going back to your parents’ house after several years to find your room just as it was before you left. That makes sense, because the IBM Model M helped popularize the standard layout of today’s keyboards.

It doesn’t have the modern accouterments- no backlighting, no pass-through for headphones or USB devices, no macro keys. But the feel of the keys more than makes up for that. I’d gladly give all of those fancy features up for the visceral satisfaction this buckling spring keyboard gives me.

This keyboard was built for offices and workspaces. It was built at a time when the age of personal computing was only getting started. Back then, loud keyboards were more accepted. We’ve moved past the time when keyboards were designed to sound similar to typewriters. We live in a time where the most frequently used keyboards are virtual ones on a glass touchscreen panel. There’s haptic feedback of course, but it is nowhere close to what you can get from a keyboard with mechanical keys.

Some parting thoughts

The IBM Model M serves as a reminder of the things we’ve left behind in our relentless quest for more affordable mass-produced computer peripherals. Mechanical keyboards have been making a comeback among a niche community of late, but of the keyboards I have tried, none come close to the IBM Model M. It’s so much more than just “good enough”.

This is, of course, my personal opinion. Not everyone likes clicky and loud keyboards, not everyone’s willing to put up with a hefty computer peripheral. But if you, like me, are filled with a strange compulsion to write, this is one of the most satisfying tools you can use.



PC Build Blog- 2016

A month ago, I built a new desktop computer for myself. I have been a PC enthusiast for years now. I’ve written a series of blog posts about the basics of getting into PC gaming before, and when my laptop began showing its age in terms of gaming performance, I decided to put my money where my mouth was and build my own PC, without any help.

Parts and Justification


Mandatory parts display “glamour shot”.

The first, and in my opinion most interesting, part is finding out all the parts you need for your system. I decided that I wanted to build a no-compromise 1080p gaming system. I got all my parts at Micro Center, a local electronics store. They had special Labor Day sale pricing at the time, and also matched prices with Newegg, which was really helpful: I was able to buy all the parts I needed, at an optimal price, without having to wait for them to arrive by mail. Another benefit was that I was able to check whether the CPU, motherboard and RAM could POST (Power-On Self-Test) at the store itself, so I could be sure that I didn’t have any components that were dead on arrival. Here are the parts that I chose:

PCPartPicker part list / Price breakdown by merchant

Type Item Price
CPU Intel Core i5-6500 3.2GHz Quad-Core Processor $197.88 @ OutletPC
Motherboard Asus H170 PRO GAMING ATX LGA1151 Motherboard $117.98 @ Newegg
Memory EVGA SuperSC 16GB (2 x 8GB) DDR4-2400 Memory
Storage Samsung 250GB 2.5″ Solid State Drive $72.99 @ SuperBiiz
Storage Western Digital Caviar Blue 1TB 3.5″ 7200RPM Internal Hard Drive $49.49 @ OutletPC
Video Card EVGA GeForce GTX 1060 6GB SC GAMING Video Card $259.99 @ B&H
Case Corsair 200R ATX Mid Tower Case $44.99 @ Newegg
Power Supply EVGA SuperNOVA G2 550W 80+ Gold Certified Fully-Modular ATX Power Supply $82.98 @ Newegg
Operating System Microsoft Windows 10 Home OEM 64-bit $84.88 @ OutletPC
Monitor Samsung S24D300H 24.0″ 60Hz Monitor
Prices include shipping, taxes, rebates, and discounts
Total (before mail-in rebates) $956.18
Mail-in rebates -$45.00
Total $911.18
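As a quick sanity check (this snippet is my own addition, not part of the PCPartPicker export), the itemized prices above can be totaled with a few lines of Python. Prices are kept in integer cents to avoid floating-point drift, and the memory and monitor are left out because no price is listed for them.

```python
# Sum the itemized prices from the part list, in integer cents.
# The memory and monitor are omitted: no price is listed for them.
prices_cents = {
    "CPU (i5-6500)": 19788,
    "Motherboard (Asus H170 PRO GAMING)": 11798,
    "SSD (Samsung 250GB)": 7299,
    "HDD (WD Caviar Blue 1TB)": 4949,
    "GPU (EVGA GTX 1060 SC)": 25999,
    "Case (Corsair 200R)": 4499,
    "PSU (EVGA SuperNOVA G2 550W)": 8298,
    "OS (Windows 10 Home OEM)": 8488,
}

total_cents = sum(prices_cents.values())
print(f"Itemized total: ${total_cents / 100:.2f}")  # $911.18
```

The priced items happen to add up to the final $911.18 figure exactly.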

For my CPU, I decided to go with a non-overclockable variant of the i5. This allowed me to skip an aftermarket cooler for the build and use the stock Intel cooler that came with the processor. I went for an H170 motherboard, as that kept costs down; a Z170 board would be overkill, as I’m not looking to overclock the system anyway. For storage, I went with a 250GB SSD to hold the operating system (Windows 10 Home) and some key programs, and a 1TB hard drive for the rest of my storage needs.

When it comes to gaming, the graphics card is one of the most important components in determining the overall performance of your system. I had two choices for my intended 1080p goal- the Nvidia GTX 1060, or the Radeon RX 480. This was a tough decision, because while the GTX 1060 beat the RX 480 in the benchmarks, the RX 480 has certain features that could lead to better gaming performance in the future: support for DX12, asynchronous compute and the Vulkan API, which could lead to performance gains if developers took advantage of them. I decided to go with the former, because I’m most concerned with how the card performs in the present, and if I did choose the RX 480, the upgrade path would most probably be getting another one and using them in CrossFire. CrossFire isn’t fully supported by all developers at the moment, and I’m not fully sold on the concept. I thought it would be better to go with a single card now and swap it out for the single most powerful card I can get in the future. Taking all of this into consideration, I went for the EVGA GTX 1060 SC edition, which is overclocked right out of the box, has 6GB of VRAM, comes in a small form factor, and was the most cost-efficient variant of the 1060 available at the time.

For my power supply needs, I went with a 550W unit made by EVGA. For the case, I went with the Corsair 200R. I liked the minimalist look of the case, the great build quality, and the front I/O that includes two USB 3.0 ports. These two parts in particular addressed a grievance I’ve had with pre-built PCs or ones assembled by third parties: they always skimped on the power supply and case to keep costs down, going for substandard no-name components. The power supply is a key component, and a cheap power supply can lead to performance issues at best and damage to your system at worst. Finally, the monitor is a 24-inch 1920x1080 Samsung monitor. Other peripherals included a Kailh blue switch mechanical keyboard, a wrist pad, a SteelSeries Rival 100 gaming mouse, and a large desk mat.

Assembling the PC

Once I had all the parts, it was time to put them all together. This was the part I was most apprehensive about at first. Assembling a computer seemed like it would be a difficult and time-consuming task requiring a lot of specialized tools. That couldn’t have been farther from the truth. All I needed was a Phillips #2 screwdriver and some patience. I watched plenty of PC build video guides on YouTube to get an idea of best practices and some tips. I would recommend watching Carey Holzman’s videos, as he goes into a lot of detail and answers a lot of questions that others generally take for granted.

Once I verified that the key components did POST, I went about preparing the case. All the screws and bits required to assemble the PC were included with the case itself, and the hard drive and SSD didn’t need any tools to be mounted. Mounting the drives was as simple as sliding them into the mounting points until a “click” signified that they were in place. I fastened them with the provided screws just for good measure.

Installing the power supply was quite straightforward. I simply had to make sure the power supply was oriented correctly and then had to mount it on to the case.


Possibly the most time-consuming task was installing the motherboard into the case. After I had installed the CPU with its fan, and the RAM, on the motherboard, I had to fit the motherboard’s I/O shield into the case. As it is held by friction rather than screws, it can get tricky; an improperly mounted I/O shield could leave certain ports not properly accessible. Once the I/O shield is mounted, you have to make sure that the holes on the motherboard align with the mounting points and standoffs on the case. Once they line up, you also have to make sure the ports on the motherboard line up with the I/O shield.

I had to reinstall the I/O shield a couple of times before it fit properly, but once it was in place, the rest of the process went quite smoothly. After aligning the motherboard, it was just a matter of using the proper screws to mount it to the case.


Installing the graphics card into the PCIe slot was also quite simple.


Wiring and wire management is another key part of the building process. Although it is easy to route all the wires through the closest routing holes, at times the easiest routing option was not the best one in terms of wire management. Thankfully, the case provided ample routing holes, which helped spread out the wires rather than creating a giant mess. Wiring the case components required me to refer to the manual. Thankfully, all the wires were labeled. The wires to and from the power supply were also labeled and not reversible, so finding the right wires and the right orientation was quite simple. For the drives, the angled SATA cables were a godsend.

Installing the Operating System and Drivers

When I had installed all the components and went into the UEFI/BIOS for the first time, I couldn’t find the USB drive as an option under boot devices. After messing around with all the options, it was a hard restart that did the trick. Once the USB drive was detected, the Windows install went quite smoothly. I ran into another hiccup when I realized that I had to install all the drivers myself. I used my laptop to download the required drivers, including those for the Ethernet controller and the graphics card, and installed them on the PC using a flash drive.

To install the rest of the basic software, I used Ninite. It’s a great tool that lets you select the software you want, and creates a custom installer that installs all the software you want in one go. I decided to go with LibreOffice for my office suite, and Foxit Reader for reading PDFs. No flash player, no Java, no Adobe Reader.

I then tweaked my Windows preferences, which included removing all the hideous app advertisements from the Start menu. This video proved to be a great reference.

Final Thoughts

Building my own PC turned out to be a very fun experience, and when I was done assembling and configuring it, I was left wanting to do it all over again. I learned a lot, right from researching parts to troubleshooting while building the PC and installing the components. In many ways, the actual process of connecting the physical components was like building Lego- all the pieces were labeled and they fit together precisely. Some things, like wire management, required some thought. Thankfully, it was not too much of a hassle, and the side panel didn’t require any excessive force to shut.

Building my own PC had been a desire of mine for a long time, and I am glad I was able to put aside my apprehension and build one for myself, all by myself.

Routines and the Quantified Self

What is the “Quantified Self” ?

These days, quite a few of us try to capture minute details about our daily lives in a digital format. We keep track of the number of steps we have taken, the calories in the day’s meals, and so on. The aim is to track these things so that we may reflect, analyze and learn about what is going on with ourselves, and eventually improve over time. This has become much easier due to smart devices and wearable technology. Each and every one of us is generating tremendous amounts of data about ourselves every single day. Systems like the Nike FuelBand, the Fitbit, and even Apple and Google’s fitness-oriented application suites want to take advantage of this trend.

At the heart of this new “Quantified Self” movement are tiny, inconspicuous sensors embedded in various devices, that help record and log surprisingly accurate and incredibly detailed information. These sensors, combined with ubiquitous computing that allows these numbers to be crunched and presented to the users in an easy to understand format, and social networks that allow the users to share and collaborate, form the core of the new “revolution” in health and wellness oriented experiences.

Although all of this is a great example of how the latest technology can be used for our benefit, the idea of the Quantified Self is not as new as one might think. We were keeping track of ourselves in various ways long before the advent of miniaturized biometric sensors and portable smart devices. Certain things, like keeping track of spending or stepping on a scale every morning, have been a part of our lives for quite a while now. What’s new is the increased appetite for self-knowledge, helped by the rich and detailed information that can now be recorded about ourselves.

Of course, there are still a few issues with the whole Quantified Self movement. One of them is keeping the user engaged. These systems currently require the user to constantly monitor or observe the information, daily or over time. This may lead to information overload, or confuse the user with too much information. Another is keeping the user motivated and interested in the system. After a while, a lot of people tend to revert to their old ways because they get bored or lose motivation, and their fitness trackers end up in a desk drawer.


One of the things I realized as I read and researched human factors is the importance of routines in our daily lives. Certain things we do, certain actions that we perform, are so familiar to us that we do not spend many attentional resources to complete them. They become “routines”. We continue to follow those routines until something unusual happens.

To understand how we can make the above-mentioned Quantified Self systems better, we need to understand how to design them better. That’s where an understanding of routines comes into the picture. If these systems become a part of our routine, completely non-intrusive and without too many demands on our attention, they might just become better experiences.

Today’s solutions

Designers have tried to work around the issue of keeping users motivated in the case of fitness tracking. Gamification- adding game-like interactive elements such as competition with others in your social network, trophies or achievements for reaching goals, or RPG-style elements such as character creation and progression- has been tried in various forms. The problem is that it lacks universal appeal. Some people really like gamification; others can’t be bothered with it.

Other attempts at helping users maintain motivation involve actual monetary incentives, such as the “Pact” app, which lets you bet money on whether or not someone will complete their fitness goals, or the “Pavlok”, a wearable device named after the Pavlov experiment, which literally gives the wearer an electric shock if they do not complete the pre-decided goal.

I believe that the solution lies in understanding how routines are created, maintained and modified. Creating a new routine or modifying an existing one is difficult compared to maintaining one, because changing certain habits takes conscious effort and attention. It takes a few cycles of the routine to fully internalize the changes. If it is too difficult, the individual may revert to old habits. Superficial motivation like gamification may not provide enough incentive for the user to completely change their routine.

What I feel would be the ideal experience:

One of the key aspects of the Quantified Self is the focus on the individual. Self-improvement, and detailed information specific to the individual, are the key points of this whole experience. Using preset goals like “10,000 steps a day” thus seems to run counter to this point. If every person is different, then every person should have goals suited to their own requirements and capacity. That is where biometric sensors fall short, and human judgment provides a more suitable solution. Sometimes it’s better to jog or run until you can feel your legs tiring out, for example, rather than just stopping after 10,000 steps every time.

That is where I feel these systems need to improve: not simply recording detailed information, but also helping create routines, and helping you find your own way of making the best use of the sensor data. Information that can help you improve your fitness by showing you how much you can do, and what you should do to push your limits. The user would know when they have done enough, when they can feel it in their own bodies, without the need for a 3D avatar of themselves telling them they did a good job.

Smartphone Rant- the Hype Disappointment Cycle Continues

The annual hype-disappointment routine that is the smartphone release cycle continues to churn, and since the last time I talked about it, things are pretty much the same. Phones, smartphones, phablets, ginormo-screen pocket-bursting behemoths roam the landscape unchallenged. Kind of like dinosaurs, in that way. Way too big, and they will probably continue to rule the landscape until there’s some major cataclysm. In that respect, I sincerely hope we’re living in the Cretaceous period of the dino-phones.

Why the sudden hate towards screen-size accretion? Well, I don’t explicitly hate the things, I’m just increasingly annoyed by them. Hundreds of tech blogs have opined about the rise in popularity of these five-point-something-plus-inch devices, their perceived usefulness, their actual usefulness; phrases like “one-handed usability” and “screen real estate” are being thrown around like stale confetti. These devices are popular, people buy them, other people wonder why, and some write about it too. With long-standing opponents of this trend (Apple) now diving headlong into this whirlpool, the soft chorus of voices asking when we will see the end of this phase seems to be getting softer.

After looking into why people seem to like large-screen phones, I can find only one proper answer. Apparently, there are a lot of people out there who want just one device, rather than multiple ones. Phablets seem to satisfy this need. But do they really? Yes, watching movies and consuming media in general seems to be better suited to them, but apart from that, there really doesn’t seem to be any added value here, at least to me. No, Samsung’s “Multi-Window” and the other implementations of multitasking are NOT useful. First of all, the number of apps that support this functionality is limited, and secondly, it isn’t really the most fluid experience one can have. I’d personally rather switch between apps than try to fit multiple windows on a screen and work as they glitch out, jump around and basically don’t function at the level of desktop applications.

Speaking of one-handed usability, these phones sometimes come with accessibility features that help users operate their phablets with one hand. So, they need a special feature just to use their own device. When you need a special mode to be able to use your device despite its size, you’re not having a very good user experience, in my opinion. “Reachability” and all these software tricks are stop-gap solutions at best. I cringe when I see the unused space that’s left when these accessibility modes are active. It looks ridiculous, and the whole “bigger screen adds more value” argument seems to fall apart.

All these gripes with the current state of mobile phone screen sizes intensified when I began to think about what my next phone should be. My phone has served me well for the past three years, and despite having good developer support in terms of custom ROMs, it’s on its last legs. What phone do I get? One of my best bets is probably the OnePlus One. Specifications, price, it’s got it all, hasn’t it? Except it doesn’t have it all. Whenever I look at the device, and when I saw it in the flesh, I simply couldn’t shake off the feeling of it being a “high quality prototype”. It is version 1.0 of a design, a platform, whatever you’d want to call it, and the subsequent versions will be better than this one, or at least they should be. Now, what does that make the users of the current device? Glorified beta testers.

How about the Z3? It looks and feels premium too doesn’t it? Oh, and why not the Z3 compact? It’s the perfect device for all my qualms! Just one thing- the UI is uninspired, and if I try to rectify that by rooting and flashing custom ROMs, well, that would mean the camera experience would take a hit. I won’t be able to use certain features of the camera due to DRM. And I’m not going to stay on the stock UI, especially when Sony tries so hard to sell me their stuff at every given opportunity when I use it.

How about the Galaxy S5? Although I don’t like to admit it, that is probably one of my safest bets right now. I can get rid of TouchWiz, and there are quite a few custom ROMs to choose from. Maybe. How about HTC devices? The latest iteration of the sense UI is pretty good, right? Well, I guess I can’t argue with that either.

There are many, many devices and OEMs I haven’t listed here, and they all feature in my thought process in some way or the other. There’s just one thing. Have you ever bought a new thing that didn’t feel new? That’s the most important thing that’s holding me back. No matter what I end up getting, it won’t replicate the experience that I had the last time, the wonder, the amazement, the delight. It’s just the same things with new coats of paint, and more unwieldy than the previous thing.

PC GAMING 101 Part 7: Do-It-Yourself versus Pre-Built – Build or Buy?

When it comes to PCs, you can either build one yourself, or get a pre-built PC from a manufacturer, or a specialized PC assembling “boutique”. Let’s have a look at all the options available, and the pros and cons of each of them:

Pre-Built PCs

Pre-built PCs are the ones that manufacturers like Dell or Acer make. If you’re inexperienced, don’t know how to build your own PC, and do not want to take any risks, a pre-built PC is what you should look into.

Advantages of a Pre-Built PC:


1. An all encompassing warranty

A preassembled workstation from a company will have a warranty that covers all parts. That means if your computer fails, the company will work with you until the offending part is found. Individual components always come with a warranty, but some people just do not want the additional hassle involved in diagnosing the problem and dealing with it.

2. Simplicity and Support

Some people are not tech savvy and simply want their system to work right away, with little or no setup time. If something doesn’t work, they want someone they can call for help, like customer service.

Boutique System Builders


This is the option that lies in between a pre-built and a DIY system, and is for people who want a higher level of customization, like water cooling or hot-swap capabilities that big system vendors generally don’t provide. Boutique builders give you more of a say in what components go into your system. Choose this option if you want that level of customization but can’t be bothered with building the system yourself. Do remember, though, that these boutique builders have their profit margins.

Another option is buying components from vendors, either online or from stores, and having them assemble the PC for you at an additional cost. The difference between this and a boutique is that you need to know exactly which components you want, and you know exactly how much the assembly costs. Remember to factor in costs of logistics and getting the assembled PC shipped/delivered to your place once it’s done.


Do It Yourself


Ever since the early days, users have had the option of assembling their own PCs. Building a PC yourself has its share of advantages and disadvantages, and although many people prefer building their own PCs so they can customize the specs to their requirements, you really need to know what you’re doing.


Advantages of Building a PC:


1. It’s Cheaper to Build

If you do things on your own, you cut out the middleman’s costs, which helps you save money on assembly as well as testing. The more powerful your intended desktop computer, the more likely you are to save money by building your own. This becomes significant when you consider higher-end PCs or workstations, as manufacturers and boutiques will have a considerable markup.

2.  You get exactly what you want

Pre-built PCs come in a pre-determined configuration, because the manufacturer selects what is easiest to assemble on a large scale. This means that you either pay for things you don’t want, or you don’t get the things you want despite paying. Also, there’s no guarantee that the components used in all the machines are the same. The manufacturer may switch suppliers due to availability, costs, etc., which means that two of the exact same model of computer can have very different parts.

3. No Bloatware

Computer manufacturers often install software on their machines in an effort to differentiate themselves from their competition. What really happens is that there is extra junk on your desktop that you can ignore, deal with, or uninstall. This takes time and effort. When you create your own machine, the only software installed is what you install.

4. Upgrade as and when you want

When it comes to upgrading your PC, if you’ve built it yourself it means you know which part or parts to swap out for new ones, and how to do it.

5. Experience

Building a computer gives you a lot of experience. The physical putting-together-of-everything phase, while also educational, doesn’t compare to the research you’ll do when building a computer. If you care about what’s going into your PC, you’ll learn all the terminology and what does what in a computer. It’s pretty useful. And of course, the actual building is fun too. And even if you fry your motherboard, you’ll learn what NOT to do afterwards!


Disadvantages of Building a PC:


1. It’s more difficult

There is, of course, a fair share of difficulty involved in building your own computer, especially if you are not familiar with setting up computers. Picking out the parts for a system can be an extremely frustrating process, particularly if you are not familiar with the technology and are building your first computer.

2. No All-Encompassing Warranty

All computer parts have the risk of failing. It doesn’t matter which company made them or which company installed them. Parts will fail. A preassembled workstation from a company will have a warranty that covers all parts. That means if your computer fails, the company will work with you until the offending part is found. Individual hardware vendors will not work with your computer as a whole unit.

3. Incompatibility Issues

You have to worry about sizes, compatible components, wattages, etc. If you don’t research things properly, you could end up with parts that don’t work well together or maybe won’t even fit into the case that you have selected.

The Bottom-line

It all depends on what the computer is for. Usually, if you are spending less than Rs. 50,000 on a computer, or just want a simple desktop system, I recommend a pre-built, simply because you get a copy of Windows already packaged with it, and for a first-time builder the hassle of building it yourself may not be worth the slightly better overall quality of the components. Manufacturers are able to get discounts because they buy in bulk. In addition, the budget market is extremely competitive, which means it is often cheaper to buy a basic computer for browsing the web and running productivity software than it is to build one yourself.

However, when it comes to building a high-end system, a workstation or a gaming PC, building one yourself is the way to go. All it takes is research and the willingness to put things together, and it offers immense satisfaction along with experience and know-how. You can build one tailored to your exact needs, right down to the aesthetics.

It comes down to what you need, how much it will cost, and if you are willing to put in the time. If you are willing, then you can get exactly what you need and potentially save money in the long term. But don’t overlook the potential hassle and time you might have to put into building it.

In the next part of PC Gaming 101, I’ll talk about some valuable resources that you should use while researching and building your PC.


My views on Google's new Material Design UI

Google introduced a UI refresh as a part of the Android L developer preview at their recently concluded developer conference, Google I/O. A lot is being said about the new design language labeled “Material Design” and Google has provided extensive guidelines to help developers design their apps in this way, moving forward. A very important aspect of this design is unity, as Google’s VP of design Matias Duarte says: 

We wanted one consistent vision for mobile, desktop and beyond, something clear and simple that people would intuitively understand.

Unity is important for Google as it will make it easier for users to access Google services across different devices. Google has clearly taken design cues from both Microsoft and Apple in Material Design, but it does not look like a patchwork of disjointed ideas; it seems very cohesive and thoughtful.

It’s all about “Paper Craft”

Paper is the fundamental design paradigm of material design. Every pixel drawn by an application resides on a sheet of paper. A typical layout is composed of multiple sheets of paper. 

Toolbars and menus can be configured to look and feel like papers on a notepad.


Depth as Hierarchy, not Ornamentation

In previous versions of Android and iOS, an excessive amount of textures, gradients and shading was used, which appeared overdone, disjointed and ugly. iOS 7 saw a radical change, stripping away all these superfluous graphics and giving rise to a “flat” UI paradigm without any gradients, shading, etc. 

Instead of going to extremes as is the case with iOS, Google has adopted a more subtle and nuanced approach. Material Design uses depth not as ornamentation, but as a way of communicating hierarchy and as a way to focus users’ attention to a task. Shadows can be added to aid the perception of depth and to highlight objects. 

While the “Flat UI” paradigm is all about taking things away (gradients, shadows, highlights, etc), this new philosophy seems to be based on adding movement, animation and colors to spruce up the user experience. 

Responses to Input

Until now, precious little was done in terms of providing users some positive feedback while interacting with the system/application. Material design incorporates visual and motion cues in an attempt to engage the user, providing input acknowledgement through animated effects that look quite refined, and not overdone.

Upon receiving an input, the system provides an instantaneous visual confirmation at the point of contact.

Use of Color

Android's Gmail app, before and after the new Material Design interface.



Taking a leaf out of the Windows Phone UI playbook, Material Design has a distinct focus on typography. The Roboto font, a mainstay on Android devices ever since Android 4.0 ICS, has been modified slightly; it is wider and rounder in an attempt to be more pleasing to the eye, especially since text is almost always white, juxtaposed against a vibrant background in the main title bar of applications. 

Simplified Icons

The trend of moving towards simpler icons instead of gaudy, texture-rich ones has been evident ever since Android ICS, and can also be seen in custom OEM skins like HTC Sense 6. 

Each icon is now reduced to a minimal form, every idea edited to its essence. Even the navigation buttons have been reduced to geometric shapes. The designs ensure readability and clarity even at small sizes. Every icon uses geometric shapes, and a play on symmetry and consistency gives each icon a unique quality. Emphasis is laid on consistency between mobile and desktop icons, and small details like rounded versus sharp corners have been addressed.

Focus on Imagery

The focus on visual content is also very obvious in the new Android L design. The image takes center stage, and designers are encouraged to use vibrant and bright imagery without resorting to stock photos. The focus on vibrancy of images has always been a part of the smartphone user experience: users prefer oversaturated images and vibrant colors in the photographs they take; they like colors to “pop” rather than look natural. The popularity of AMOLED display technology, and display calibration by OEMs that favors oversaturated over true-to-life colors, supports this observation. 

Just like the Windows Phone UI, Material Design relies on images that go right up to the edges of the containing area without any window borders. It’s all big, bold squares/rectangles rather than icons and windows. 

The “Card” Concept extended


Google has been shifting to a “card” user interface: a rectangle or tile that contains a unique set of related information. Cards are typically an entry point to more complex and detailed information. These cards or tiles have been a part of the UI in Google Now and a host of other applications like Google+. The way these tiles update the user with live information is similar to Microsoft’s live tiles in the Windows Phone UI, for instance, showing the details of your next appointment on the calendar tile. Cards provide the user with summarized, glanceable information and will be used extensively in the future as the focus on wearable technology increases. 

Moving Towards Consistency

Google’s new design language is a good refresh, and brings a lot to the table in terms of design. However, one of the most important aspects of Material Design is the depth, detail and systematic nature of its documentation. After a long era of designers and developers creating Android experiences that often feel renegade or pieced together, Google has undoubtedly stepped up its efforts to standardize and improve the UI and UX across its app ecosystem. If it’s adopted, it’ll certainly lend a much-needed consistency to that world. 

Keeps up with current design trends

Google is trying to incorporate uniformity by getting ahead of all the screen sizes it now supports and providing some real structure. It seems they really tried to set up a foolproof way to design around all of those screen sizes, from the desktop experience to Glass to the watch. The effort is extremely expressive, and is obviously about controlling the experience. Instead of imposing a strict visual aesthetic, Google defined a set of principles that leave more freedom to individual designers, while still pushing their numerous apps in the same consistent direction. 

In Conclusion…

Many will see Material as a further extension of the flat era of design, in the same way Windows 8 and iOS 7 use large areas of solid color and wide open spaces with a focus on typography. I think it’s more than that: the current design trends are the only sane way to support a wide range of display sizes, ratios, and pixel densities. Physics, animation, and some of the layering effects are only now possible because the hardware allows it. The new design has elements that dynamically shrink and expand, adds more white space between elements, offers lots of animation, and provides a more 3D look emphasized by shadows and lighting effects. It’s designed to put the emphasis on the most important content of a screen. Although these are just visual effects today, they could be handy in future years with 3D displays and the possibility of tactile touch screens that actually raise portions of a display. 

Maybe this is Google’s way of filling the void left by the demise of richly textured skeuomorphic designs? In any case, we can only hope it will add a little warmth and humanity to digital design and save us from a world where every app looks and behaves the same. Overall? I like it, I’m glad it’s here, but I don’t find myself bowled over by any individual component of the new system. It’s a well-considered stride in a necessary direction. I see this as a great effort in laying the groundwork for a very Google-driven future ecosystem.

The video below reveals how the Material design language works across all devices Google touches, from smartphones to Glass to wearables.

PC Gaming 101: Part 6: Cases

PC gaming is quite big in India. As games become more intense and compelling, gamers find themselves wanting the latest and greatest hardware to run these games smoothly. That being said, the majority of gamers wanting to build or upgrade their machines don’t have much of a clue, and are often at the mercy of vendors and salesmen, due to which, more often than not, they end up making the wrong decisions. This is an attempt to address this lack of information, and help all PC gamers make the best of their resources. This is PC GAMING 101.

Getting the right case or cabinet for your gaming PC is an unnecessarily complicated affair. The way computer cases are classified has changed over time. Once upon a time, it depended on how many 5.25 inch bays the case had. Then, classification was based on the overall height of the case. These days, those classifications are more like guidelines than actual standards. The traditional size categories are shown below: 

Source: Tom's Hardware


Full Towers are the SUVs of the computer case world. They can have 5 or more 5.25 inch external drive bays, they range in height from 22 to 27 inches, and they always support full-size ATX, almost always support EATX, and at times even the not-so-standard XL-ATX. The funny thing about full towers is that apart from accepting more drives, providing better cooling for hot-running, inefficient setups like 4-way graphics configurations, and having extra space for superfluous stuff like custom liquid cooling loops, they don't bring much to the table in terms of performance over Mid Tower cases. They do tend to be easier to work with if you've got big hands, and the top bays are easier to reach when the case is sitting on the floor. All in all, Full Tower cases are more luxury items than must-haves.

Mid Towers are the most common cases for custom builders. They have 3 to 4 external expansion bays, stand 17 to 21 inches tall, and almost always hold a full-size ATX motherboard. But they don't have a lot of extra space for drives and whatnot: expect to find 6 to 8 hard drive mounts in a typical Mid Tower, and enough cooling and space to comfortably handle 2 graphics cards in CrossFire/SLI.

Computer case form factors: Full Tower (extreme right), Small Form Factor (center), Horizontal Desktop (bottom center), Mid Tower (extreme left)


Mini Towers are a great compromise between size and expansion. They have 1 or 2 external bays, stand 14 to 16 inches tall, and can usually host an mATX motherboard. Thanks to their less-imposing profile and easier transportability, they are nearly as versatile as mid-towers in applications ranging from office workhorses to high-end liquid-cooled, SLI-powered gaming monsters. That said, most Mini Towers can adequately cool only a single graphics card, while some may be okay with two. 

Anything taller than 27 inches is called a Super/Ultra Tower, and a case whose size can be modified by stacking components on top of each other, for, say, cooling options or drive mounting, is called a Mod Tower. Desktop or horizontal desktop cases are not exactly towers. They used to be the dominant case size once upon a time, but are now more of a niche. They come in various sizes, from tiny ones so small they need an external power brick, to huge ones that can hold server-class motherboards and large RAID arrays of hard drives. 

Small Form Factor or SFF cases can come in almost any shape, from cubes to squat desktops to normal towers, but the one thing they generally have in common is support for at most a mini-ITX motherboard, with minimal drive mounting options and, at best, room for a single graphics card. Cube cases are called so because of their roughly cubical shape, and are available in a wide variety of configurations. 

There's other stuff out there, but these are the main form factors available today. 
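The height ranges above can be sketched as a simple classifier. Remember, these are guidelines rather than hard standards, so real cases blur the boundaries:

```python
def classify_tower(height_inches: float) -> str:
    """Rough case classification by height, using the guideline ranges above."""
    if height_inches > 27:
        return "Super/Ultra Tower"
    if height_inches >= 22:
        return "Full Tower"
    if height_inches >= 17:
        return "Mid Tower"
    if height_inches >= 14:
        return "Mini Tower"
    return "Small Form Factor / Desktop"

print(classify_tower(19))  # → Mid Tower
```

A 19-inch case lands in Mid Tower territory, which is exactly where most custom builds end up.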

Prev>> Part 5: Gaming Monitors

PC Gaming 101: Part 5: Gaming Monitor Buyer's guide



If you own and regularly use a PC, you know what a monitor is. However, when it comes to gaming, not all monitors are built equally. So, what makes a monitor "good for gaming"? (Well, for starters, it should connect to a device that runs videogames.) Let's have a look at the things you should look out for while choosing a monitor for your gaming setup: 

1. Inputs 


Most gaming monitors these days have DisplayPort, HDMI and DVI input ports, or a combination of the three. (You can read more about display technologies and standards in Part 4, here.) If you're gaming on a PC and you want to keep things as simple as possible, you can go with DVI or DisplayPort with confidence. HDMI will work fine too, unless you want a resolution higher than 1080P or a refresh rate over 60Hz; HDMI 2.0 is coming out soon to address these limits. HDMI inputs aren't useless though: you can use them to connect secondary gaming devices such as consoles and switch between your devices as you choose. 

2. Size Matters 

Yes, a monitor's size does matter, but not for the reasons most people think. A larger monitor just puts a larger image in front of you, and isn't any more difficult for your graphics card to power. So pick a size that's comfortable for you at the distance you want to sit from it. The spec that determines how hard a monitor is to power is the resolution. A 24 inch 4K monitor will be about 4 times more difficult to drive than even an 80 inch 1080P "Full HD" TV, because of the sheer number of pixels. Higher resolution monitors deliver a clearer, more "retina-like" image, so resolution isn't a problem in and of itself; it's just a factor to weigh in your overall build/upgrade budget. 
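A quick bit of arithmetic shows why resolution, not screen size, is what taxes the graphics card:

```python
# Pixel count is what the GPU has to render; physical size is irrelevant.
def pixel_count(width: int, height: int) -> int:
    return width * height

uhd = pixel_count(3840, 2160)      # a 24-inch 4K monitor
full_hd = pixel_count(1920, 1080)  # an 80-inch 1080P TV

print(uhd / full_hd)  # → 4.0: the 4K panel has four times the pixels to drive
```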

Now that we've gone through the basics of monitors and displays, let's look at what makes a monitor "good for gaming". 

Response Time  

The way pixels are rendered on an LCD/LED display is very different from old tube-style CRT monitors: when the image updates, the pixels gradually shift from one colour to another. The slower the monitor's pixels respond, the more "motion blur" or ugly streaking you'll see behind moving objects on the screen. 

So, while buying a monitor for gaming, look for a "Grey to Grey" response time of:

8-16 milliseconds for casual use

1-2 ms for competitive use.

Refresh Rate 

60Hz versus 144 Hz

Expressed in Hertz, the refresh rate is the number of times an image is sent to the display every second. If your eyes are getting more updates per second, you're getting information slightly faster than your opponent. It's a definite advantage, and the fastest monitors these days can run at up to 144 Hz at 1080P. That means you can get screen updates up to 10 milliseconds sooner than an opponent using a 60 Hz display. 
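The "up to 10 milliseconds" figure falls straight out of the frame times:

```python
# Interval between screen updates at a given refresh rate.
def frame_time_ms(refresh_hz: float) -> float:
    return 1000.0 / refresh_hz

advantage = frame_time_ms(60) - frame_time_ms(144)
print(round(advantage, 2))  # → 9.72 ms sooner per update at 144 Hz
```

A 60 Hz panel refreshes every ~16.67 ms, a 144 Hz panel every ~6.94 ms, so in the worst case the 144 Hz player sees a new frame nearly 10 ms earlier.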

Input Lag

Now, this is a spec most manufacturers don't report, but it's really quite important. When the PC sends signals to the monitor, the monitor needs to translate that information into a format the panel can understand. This processing introduces a delay, which means an individual frame could appear anywhere from a few milliseconds after it was output by your graphics card all the way up to 50 milliseconds later, or more. For competitive use, look for a monitor with an input lag of less than 10 milliseconds. But don't just take the manufacturer's word for it; LCD manufacturers are notorious for inventing completely new specifications to suit their marketing purposes. So be sure to check out sites like Blur Busters for the latest info and specs on gaming displays. 

Other Features

Apart from the factors mentioned above, there are other features to look out for as well. 

Now, if this guide raised more questions than it answered, or you'd just like to go hands-on and decide which specs matter for you, check out online forums; they can really help. 

Prev>> Part 4: Display Technologies

Next>>Part 6: Computer Cases


Mechanical Keyboards: Worth the Hype?

To most of us, keyboards are all the same: just rows of keys, numbers and symbols that allow us to type on a computer. Keyboards are just cheap plastic peripherals, and apart from the different branding, they all look alike. For those who type on a regular basis as part of their profession, however, such an oversight can be harmful. Not paying attention to the choice of keyboard puts the user at risk of repetitive strain injuries or even carpal tunnel syndrome.

These days, there’s an interesting trend among PC enthusiasts: mechanical keyboards. They’re different from standard keyboards, and some people claim they help you type more accurately and even last longer than their normal counterparts. Let’s take a look at this trend, and help you decide whether you should make the switch from a normal keyboard to a mechanical one.

What’s a Mechanical Keyboard?

Switches in action

Mechanical keyboards use switches to register user input

A mechanical keyboard uses physical switches to determine when the user has pushed a key. Press a key, and the switch is pushed downwards, and that sends a signal to the PC telling it that a key has been pressed.

What’s so remarkable about that?

At first glance, this seems like any other keyboard: you push a button, and the corresponding character appears on the screen. Think about it for a moment, though, and you’ll notice that on a typical keyboard, for the character to appear on the screen, you have to push the key as far down as it can go. That means you need to apply quite a bit of pressure on every key just to register an input. Imagine a writer or a programmer who has to type continuously for several hours a day. Typing for hours at a stretch causes fatigue, and cases of computer-related injuries are quite prevalent these days.

Just a normal keyboard

Standard keyboards use plastic membranes. Inexpensive, but they tend to cause fatigue.

The reason behind this is that most keyboards these days are composed of a set of three plastic membranes, with rubber dome-shaped switches underneath each key. Press a key, and the rubber switch pushes through a hole in the middle membrane to connect the top and bottom membranes, which creates an electrical circuit that causes the keyboard to send the input to your PC. This keyboard design is inexpensive and spill-resistant, but it doesn’t give you as much tactile or audible feedback when you press a key, and you have to press each individual key harder, which affects typing and causes fatigue much faster.

How exactly are Mechanical Keyboards different?


1. They feel different

When you swap your normal keyboard for a mechanical one, the first thing you’ll notice is that every key, when pressed, gives a clicking sound and a tactile response, and you don’t have to press the keys as hard as you would on a normal keyboard. This is one of the most important differences. Each keystroke requires much less pressure, and you get that reassuring click and feedback telling you the key has been registered properly.

2. They are much louder

Mechanical keyboards tend to produce much more noise than normal keyboards, and how loud they get depends on the type of switches used as well as the typing style of the user. This may be an issue at a workplace, where co-workers may hear the sound of your typing. Many people also say it tends to drown out peoples’ voices in video calls. This is highly subjective, but an important thing to remember.

3. They are much heavier and bulkier

Mechanical keyboards tend to be much heavier and bulkier than their standard counterparts. This is good in one way: the added bulk and weight means the keyboard will not slide around on a table, and will stay put. But it does affect portability, and you have to consider this if you constantly move your computer setup from place to place.

4. They are Expensive

All this mechanical goodness comes at a premium, though. Mechanical keyboards are significantly more expensive than their normal counterparts. Mechanical keyboards from big-name brands can retail for INR 5500 and even higher. In India, however, TVS manufactures a mechanical keyboard that retails for around INR 1500. Even at that price, it’s still much more than one would pay for a standard wired or wireless keyboard.

5. They are sturdy and last longer

Mechanical keyboards, owing to their construction, tend to last much longer than regular membrane-based keyboards. While most regular keyboards are rated for 5 million keystrokes, they generally last about 2 or 3 years before needing a replacement. Mechanical keyboards, on the other hand, are generally rated for 50 million keystrokes, which means they can easily last a decade or more. There are numerous accounts of people using mechanical keyboards for more than a decade without any issues or failures. The use of switches as the input mechanism greatly improves the longevity of mechanical keyboards, and thus they offer great value for money. 
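Those keystroke ratings translate into rough lifespans with a bit of arithmetic. The 10,000 keystrokes/day workload below is an assumed figure for a heavy typist, not a measured one:

```python
# Back-of-the-envelope lifespan from a keystroke rating.
# The 10,000 keystrokes/day workload is an assumption for a heavy typist.
def lifespan_years(rated_keystrokes: int, strokes_per_day: int = 10_000) -> float:
    return rated_keystrokes / strokes_per_day / 365

print(round(lifespan_years(5_000_000), 1))   # membrane rating: ~1.4 years
print(round(lifespan_years(50_000_000), 1))  # mechanical rating: ~13.7 years
```

The 10x rating gap is exactly why mechanical boards routinely outlive several membrane replacements.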

How Mechanical Keyboards affect typing:

The longer you use mechanical keyboards, the more apparent the changes in your typing style will be. Mechanical keyboard users tend to apply much less pressure while typing, which leads to much less fatigue and a noticeably faster typing speed. This may not be a big deal for light users, but for professionals like writers, bloggers or programmers, it can prove significant: typing requires less energy, fatigue doesn’t set in as soon as it would on a normal keyboard, and more typing gets done in a shorter time frame.

Should you get one? 

So, finally, whether a mechanical keyboard is worth the added expense is up to you. If you’re a person who types continuously for hours on end, or you’re into gaming, these keyboards offer a significant advantage over standard keyboards. One thing is certain, though: once you begin using a mechanical keyboard, there’s no way you’ll feel like using a standard keyboard ever again! 

What are your thoughts on mechanical keyboards? Do you want one? Do you have one? Share your thoughts and experiences in the comments below! 

PC Gaming 101: Part 4: Display Tech Explained


When building or upgrading a PC, it’s essential to know what kinds of display outputs it supports. The issue is that there’s no single standard display technology, so it’s easy to get confused. On the back of your PC or graphics card you’ll see a host of different connectors. Let’s look at the differences between them and what kind of display technology you should invest in, depending on your needs:

1. VGA

The standard "blue cable" VGA





The oldest standard in existence, Video Graphics Array or VGA was first introduced in 1987. This is usually what comes to mind when someone mentions “display cable”. The standard blue-colored 15-pin connector in its typical trapezoid shape, VGA carries analog signals, so the signal quality greatly depends on the quality of the cable. This standard is now obsolete, and there’s a limit to the resolution VGA cables can support. However, there are still a whole lot of analog monitors and projectors out there, especially in India, where VGA is used in most scenarios.

VGA ports are synonymous with "PC Display"

If you're a gamer and you're rocking a VGA display, it’s probably time to invest in a better display, as future higher-end graphics cards and gaming PCs will not be compatible. Even today, you'll have to search a lot to find a graphics card with VGA output. You can always use a VGA adapter, but unless you really can’t upgrade, it’s better to move away from the now seemingly stone-age VGA.


2. DVI

The different kinds of DVI connectors

The current reigning champion of display outputs, DVI, or Digital Visual Interface, is one of the most ubiquitous successors to VGA. DVI comes in different flavors, namely DVI-D (digital only), DVI-A (analog only), and DVI-I (digital and analog). However, DVI-D is what you’ll most probably find and use. DVI marks the beginning of digital signals being used in display technology, and offers good compatibility with the older VGA standard, with DVI to VGA adapters very easily available (for use with DVI-A and DVI-I).

Because DVI is both backwards and forwards compatible using easy to use adapters, it's very convenient

Most graphics cards come with multiple DVI connectors, and most modern displays have DVI support out of the box. DVI offers higher data rates, support for higher resolutions, and is found in all competent displays and graphics cards of today. There are two kinds of DVI connectors: single link and dual link. Single link DVI supports a display of 1920×1200 at 60Hz, whereas dual link supports up to either 2560×1600 at 60 Hz (30 inch monitor resolution) or 1920×1200 at 120Hz (for 3D gaming). The absence of analog signaling means you're no longer tied down by cable quality; unless you're running monitors over a very long distance, any standard cable will get you optimal image quality with DVI. It’s also very easy to convert to other standards, older or newer: all you need is an adapter. However, DVI is getting old too; now that the race for the next generation of high-resolution display technology has begun, the time when it becomes a thing of the past is not too far away.
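The single vs dual link modes above can be compared directly by raw pixel throughput. This ignores blanking intervals, so it's a rough comparison rather than the actual TMDS bandwidth:

```python
# Raw pixel throughput for the DVI modes quoted above (ignores blanking).
def pixel_rate(width: int, height: int, refresh_hz: int) -> int:
    return width * height * refresh_hz

single_link = pixel_rate(1920, 1200, 60)   # single-link DVI ceiling
dual_30inch = pixel_rate(2560, 1600, 60)   # dual-link: 30-inch monitors
dual_120hz  = pixel_rate(1920, 1200, 120)  # dual-link: 120 Hz for 3D

print(dual_120hz // single_link)  # → 2: dual link carries twice the pixels
```

That factor of two is no accident: dual-link DVI literally doubles the number of data pairs in the cable.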


3. HDMI

HDMI is the new, ubiquitous standard

As mentioned before, the next generation of high-resolution display technology is upon us, and HDMI or High Definition Multimedia Interface is at the forefront of it. HDMI is designed to be a replacement for existing analog video standards. The standard was put together by a number of big-name companies working together, and it comes in different formats and port sizes as well, although the standard Type-A HDMI connector is used in most TVs and monitors. It carries uncompressed digital audio and video data in a TV or PC video format. The rise of HD televisions, with the “Full HD” moniker being thrown about a lot these days, means HDMI is getting a huge marketing push, and it surely is a competent standard. It’s not exactly a PC display standard, but if you're investing in an HD monitor for your PC it will definitely have HDMI support. The image quality and signal are at most times identical to DVI. It is backward compatible with DVI, and the connector is much more compact. Currently at version 2.0, HDMI offers a wide gamut of features including support for good old sRGB, Ethernet, HD-ready Blu-ray and 3D-ready TVs, and even 4K resolution at 60 FPS.

4. DisplayPort

DisplayPort- the newest entrant in the Display Interface scenario

The newest of the standards out there, DisplayPort is royalty-free: while there’s a royalty behind every HDMI cable produced, DisplayPort-based interfaces can be manufactured without any such royalties, which has made it quite attractive for manufacturers. DisplayPort connectors are surely the easiest to use of the lot: they do away with the old-school screw-locking system of VGA, and aren't as insecure as the non-locking HDMI connectors, which are known to disconnect easily. DisplayPort also supports resolutions even higher than HDMI, up to a maximum of 3840×2160 at 60 Hz. Manufacturers have begun to include DisplayPort interfaces in the latest graphics cards, and it surely seems like a promising prospect for the future. However, the main issue with DisplayPort is that it isn't compatible with any other display standard, i.e. there aren't any easy-to-use adapters available that can convert DisplayPort to any other current standard. DisplayPort comes in two sizes: standard and mini DisplayPort. Manufacturers often use mini DisplayPort as it takes up much less space on the output interface of the card, making multi-display setups running off the same video card possible.

HDMI vs DisplayPort

So, you’re looking to be on the absolute bleeding edge of display technology and want the latest and greatest display at the highest possible resolution. That narrows down your search to HDMI and DisplayPort. Which standard should you invest in?

When to use HDMI: If you’re looking at a setup that is basically a single screen running at 1920×1200, and the display is not far from the video output, you’re better off using HDMI. It’s ubiquitous, easy to use and will offer great image quality. However, the lack of a locking mechanism in the HDMI port means that over longer cable runs the cable can come loose or fall off entirely.

When to use DisplayPort: If you're one of those gamers who have multiple displays daisy-chained in a single system, or you want to run a really high resolution on a gigantic monitor, you're better off using DisplayPort. Also, over longer distances, the secure locking mechanism of DisplayPort means you can be sure of the cable staying put. You can use an HDMI cable for this too, but it’s much easier to connect monitors to multiple mini DisplayPorts than through an HDMI interface.
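The advice in these two sections boils down to a simple rule of thumb; the sketch below just encodes it. The thresholds are my simplification of the guidance above, not part of either specification:

```python
# Rule-of-thumb interface chooser distilled from the advice above.
# Thresholds are simplifications, not part of the HDMI or DisplayPort specs.
def pick_interface(num_displays: int, width: int, height: int,
                   long_cable_run: bool) -> str:
    if num_displays > 1 or long_cable_run or width * height > 1920 * 1200:
        return "DisplayPort"
    return "HDMI"

print(pick_interface(1, 1920, 1200, False))  # → HDMI: single nearby screen
print(pick_interface(3, 1920, 1080, False))  # → DisplayPort: daisy-chained displays
```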

For more information about this often ignored aspect of PC gaming, be sure to check out the Wikipedia page of the respective standards for all the technical specs.

Prev>> Part 3: Be a Smart Buyer