Proposed EU measures to extend product lifetime

I am happy to let you know that things are underway within the EU to ensure that products will last longer and will be easier to repair in the future. These are only proposed measures at the moment, not yet law, but they soon could be. The idea, according to the EEB (European Environmental Bureau), is to:

  1. Extend the lifetime of products
  2. Extend the availability of repair services
  3. Improve consumer information and rights
  4. Make these measures binding, not voluntary

If you live within the EU, I encourage you to contact your representatives in the European Parliament and ask them to support these proposed measures.

Even if this isn’t law yet, I am happy to see my own feelings on the matter mirrored by those in a position to do something about it. You may recall that I wrote an article called “Truly sustainable computing” back in August of 2015, where I proposed that desktop computers have a projected lifespan of 20 years and that laptops and mobile phones have a projected lifespan of 10 years.

The proposed EU measures would apply to every category of products, not just computing devices, so things like cars, electronics and appliances would all be covered by the new regulations, ensuring we would once again have quality products that last a long time.

I say “once again” because those of you who are younger than me may not recall that we had this sort of thing before the 1970s. The idea of “planned obsolescence” was embraced by manufacturers in the 1960s, and that’s when things started to go downhill for products in terms of durability, repairability and build quality. Kitchen appliances made in the late 1960s could still look and work perfectly in the late 2000s. You can no longer say that about today’s appliances.

It’s irresponsible in so many ways for us to generate mountains of e-waste every year, and it’s doubly irresponsible for manufacturers to enable it: first, because they use the Earth’s resources without any regard for the future, and second, because they make products easily breakable and disposable, contributing to the enormous amount of waste we generate as a species. It’s time we did something quantifiable and legally binding about this!

On self-driving cars

Here is a video I made about self-driving cars, recorded in our VW Passat station wagon, which has a feature called Adaptive Cruise Control. When I got this car, I was amazed at how it could apply braking and acceleration as needed in order to keep the car at a safe distance from those in front of it. The only thing I needed to do was keep my hands on the steering wheel.

Now that self-driving cars are beginning to enter the marketplace, we’ll be able to take our hands completely off the wheel and focus on tasks that are more useful to us, such as reading, looking out the window, talking to our spouse or children, even catching up on sleep.

I’d love to see self-driving cars become mainstream, with the option of turning off the auto-pilot and driving them manually every now and then!

Permanent data storage

We need to focus our efforts on finding more permanent ways to store data. What we have now is inadequate. Hard drives are susceptible to failure, data corruption and data erasure (see the effects of EM pulses, for example). CDs and DVDs become unreadable after several years, and even archival-quality optical media stops working after 10-15 years, not to mention that the hardware that reads and writes this media changes so fast that media written in the past may become unreadable in the future simply because there’s nothing left to read it. I don’t think digital bits and codecs are a future-proof solution, but I do think imagery (stills or sequences of stills) and text are the way to go. It’s the way past cultures and civilizations have passed on their knowledge. However, we need to move past pictographs on cave walls and cuneiform writing on stone tablets. Our data storage needs are quite large and we need systems that can accommodate these requirements.

We need to be able to read/write data to permanent media that stores it for hundreds, thousands and even tens of thousands of years, so that we don’t lose our collective knowledge, so that future generations can benefit from all our discoveries, study us, find out what worked and what didn’t.

We need to find ways to store our knowledge permanently in ways that can be easily accessed and read in the future. We need to start thinking long-term when it comes to inventing and marketing data storage devices. I hope this post spurs you on to do some thinking of your own about this topic. Who knows what you might invent?

Mobile phones as desktop and laptop replacements

It’s high time we were able to come home, place our mobile phones in a dock that’s connected to a display, keyboard and mouse, and have them turn into full-fledged desktop and laptop replacements. Mobile phones have sufficient computing power for most of our needs, they have the apps most of us use on desktops as well, and there are incredible energy savings to be had. Hardware manufacturers need to start making sincere, concerted efforts toward this end.

You may also want to read through this post of mine, where I tried my best to use a tablet (an iPad) as my main computer, only to be frustrated to no end by the lack of common amenities and functionality we’ve come to expect on desktops and laptops: simple things such as the use of a mouse, drag-and-drop between folders, a finder/file explorer and the ability to easily access drives and files on the network.

I realize that people who engage in heavy computing on a daily basis, such as 4K video editing, 3D graphics and 3D video rendering, large-scale CAD projects, serious coding that requires powerful compilers and other such tasks, will still need very powerful desktop computers and even small server farms in order to do their jobs, and I am in no way suggesting that they start using mobile phones for their work.

We simply have to acknowledge that the majority of people who use computers can do just fine with the computing power of a mobile phone. I’m talking about the people who mostly check their email, use social networking sites and their apps, do some online banking, and take casual photos and videos. What if all those people were able to use their mobile phones as desktop or laptop replacements? Wouldn’t that be a significant cost saving for them?

Looking at the bigger picture, if all those people, or at least a significant portion of them, did this, wouldn’t that translate into significant energy savings for cities, counties, states and countries? Aren’t we always talking about reducing our carbon footprint? Well, instead of using a laptop that consumes about 60W when plugged in, or a desktop that eats up about 200W, give or take, why not use a mobile phone that consumes 3-5W when plugged in?
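
To put rough numbers on it, here’s a back-of-the-envelope sketch. The wattages are the approximate figures quoted above, and the four hours of daily plugged-in use is purely my own assumption:

# Rough yearly energy use at an assumed 4 hours/day of plugged-in use
echo "Desktop: $(( 200 * 4 * 365 / 1000 )) kWh/year"
echo "Laptop:  $((  60 * 4 * 365 / 1000 )) kWh/year"
echo "Phone:   $((   5 * 4 * 365 / 1000 )) kWh/year"

Under those assumptions, that works out to roughly 292 kWh a year for the desktop, 87 kWh for the laptop and about 7 kWh for the phone, per person, so the difference adds up quickly when you multiply it by millions of users.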

My video recording setup

After working with this setup for over a year, I wanted to share it with you, having gained full knowledge of its advantages and disadvantages. It is what works for me at this time, in my particular situation, but it may help you as well, if like me, you make your own videos and don’t have a team working behind the scenes.

This 3-4 camera setup is what I use to film my wife’s shows: Ligia’s Kitchen and De Vorbă cu Ligia. For my own videos I use a more basic 1-2 camera setup, since I have to be both in front of and behind the camera.

Let me state the advantages and disadvantages first, and then I’ll give you the exact list of equipment.

The advantages:

  • The big iPad displays allow for proper framing, focusing and exposure control. I always disliked those tiny screens on DSLRs and video cameras.
  • The iPads have big batteries that allow for hours of filming (the iPod Touch, not so much).
  • The featherweight iPod Touch can be mounted in all sorts of unusual spots (including overhead).
  • Live viewing and control of the video feeds (things such as focus, exposure, white balance), including instantaneous switching between the feeds, from a master iPad. One person can manage all of the cameras at once. This setup allows up to four angles at a time.
  • Lightweight, small and highly portable setup.
  • With the aid of dedicated apps, you can get very good control of the video quality and look, right in the camera, without having to resort to a lot of post-editing.

The disadvantages:

  • Video quality isn’t on par with what you can obtain from a good DSLR with a good lens or better yet, from a dedicated, professional video camera. The dynamic range isn’t there, the noise levels are fairly high, the focus isn’t crisp enough. What you’ll need to do to compensate is to make sure your lighting is as good as you can get it.
  • The battery life of the iPod Touch is terrible. Have an external power source (plug or power bank) readily available if you need to record more than 30-45 minutes of video.
  • Also, the iPod Touch has a much slower processor than the iPad, so don’t attempt to use it as a master controller or for video editing. Use it only as a slave camera and be prepared to wait a good, long while when it updates itself with new Apple software and apps.
  • The on-camera (iPad Air and iPod Touch) microphones don’t offer good sound. The iPad’s microphone is passable from a close distance when nothing else is available, but that of the iPod Touch sounds tinny, no matter the situation. Use shotgun, dedicated or lavalier microphones for better sound.
  • You’ll need a good WiFi signal in the room where you’re recording, if you want to manage the video feeds from each iPad camera on a master iPad.

The equipment list:

  • 4 iPad Air units: mine are 128 GB first-generation models. I got a great deal on them at B&H Photo about a year and a half ago; I think they were old stock and the store was making room for the 2nd-generation iPads. I went for the largest capacity available because I wanted to be able to record lots of video without needing to stop and offload. It just so happened that they also had 4G LTE, which was a nice plus. It was the right decision.
  • 1 iPod Touch unit: I got this because it was small and I wanted to use it for overhead angles, where a heavy camera might fall on my head. I didn’t want to use my iPhone, just in case it ever fell from its rigging. (An iPod is cheaper to replace than an iPhone.) It was the right decision. The short battery life and slow performance were unexpected and disappointing, but it does its job when needed.
  • 5 iOgrapher cases for the iPads and iPod Touch: check out their website; they keep working on their cases and have developed new ones to fit the new gadgets from Apple. I love their cases because they work both handheld (they have two big handles, one on each side) and mounted on a tripod. And they have mounts for external microphones and lights, right on top where they’re needed.
  • 5-6 iOgrapher lenses: I use a mix of Telephoto and Wide Angle lenses made for the iOgrapher cases; they use a 37 mm mount. They’re not pro-level lenses and they have a bit of distortion and chromatic aberration around the edges, but they’ll do the job.
  • 2 Rode smartLav+ lavalier microphones. This is where our iPhones are useful. We put them in Airplane mode, plug these mics into them, start up the Voice Memos app and slide them into our pockets. We get to record great audio with little fuss.
  • The following iPad/iPod Touch/iPhone apps: the built-in Camera app that comes with iOS, RecoLive MultiCam and Filmic Pro.
  • 1 or more video lights. There are a ton of options here. We use this one. Its advantage is that it comes with interchangeable color filters that shift the temperature of the light.
  • 2 or more softboxes mounted on C-stands for each set.
  • 1 hair light such as this one. I mounted it on a C-stand that I extended to its maximum height and lateral length. You may need to use some sandbags to stabilize the stand.
  • We also use the room’s own lighting for effect and illumination. I tend to use cold color temperatures for the studio lights (white CFLs and LEDs) and warm color temperatures for the room lights. I know people say you shouldn’t mix light colors when you’re shooting video or photos, but I like it. When they’re mixed the right way, they give me a “live” white balance, an in-studio “look” for my video, which is better than doing it in post.

If you have any questions or if I’ve forgotten to mention anything, let me know in the comments. I hope this helps you!

A comparison of CrashPlan and Backblaze

I’ve been a paying CrashPlan customer since 2012 and my initial backup still hasn’t finished. I’ve been a paying Backblaze customer for less than a month and my initial backup is already complete. 

I’m not a typical customer for backup companies. Most people back up about 1 TB of data or less. The size of my minimum backup set is about 9 TB. If I count all the stuff I want to back up, it’s about 12 TB. And that’s a problem with most backup services.

First, let me say this: I didn’t write this post to trash CrashPlan. Their backup service works and it’s worked well for other members of my family. It just hasn’t worked for me. This is because they only offer a certain amount of bandwidth to each user. It’s called bandwidth throttling and it saves them money in two ways: (1) they end up paying less for their monthly bandwidth (which adds up to a lot for a company offering backup services) and (2) they filter out heavy users like me, who tend to fill up a lot of their drives with unprofitable data. My guess (from my experience with them) is that they throttle heavy users with large backup sets much more than they throttle regular users. The end result of this bandwidth throttling is that, even though I’ve been a customer since 2012 — at first, I was on the individual backup plan, then I switched to the family plan — my initial backup never completed and I was well on track to never completing it.

When I stopped using CrashPlan’s backup service, out of the almost 9 TB of data that I need to back up constantly, I had only managed to upload 0.9 TB in FOUR YEARS. Take a moment and think about that, and then you’ll realize how much bandwidth throttling CrashPlan applies to heavy users like me.

[Screenshot: After four years of continuous use, I backed up a grand total of 905.7 GB to CrashPlan]

To be exact, counting the various versions of my data that had accumulated on the CrashPlan servers in those four years, I had a total of 2.8 TB stored on their servers, but even if you count that as the total, 2.8 TB in FOUR YEARS is still an awfully small amount.

[Screenshot: Space used on CrashPlan’s servers: 2.8 TB]

Tell me honestly, which one of you wants this kind of service from a backup company? You pay them for years in a row and your initial backup never finishes? If a data loss event occurs and your local backup is gone (say a fire, flood or burglary), you’re pretty much screwed and you’ll only be able to recover a small portion of your data from their servers, even though you’ve been a faithful, paying customer for years… That just isn’t right.

I talked with CrashPlan techs twice in these four years about this very problematic data throttling. Given that they advertise their service as “unlimited backup”, this is also an ethical issue: the backup isn’t truly unlimited if it’s heavily throttled and you can never back up all of your data. The answer was the same both times, even the wording was the same, which made me think it was scripted: they said that in an effort to keep costs affordable, they have to limit the upload speeds of every user. The first time I asked, they suggested their Business plan, which has higher upload speeds; in other words, they tried to upsell me. Both times, they advertised their “seed drive service”, which was a paid product (they stopped offering it this summer). The gist of it was that they shipped customers who asked for it a 1 TB drive, so they could back up to it locally and then send it back to jumpstart the cloud backup. Again, given my need to back up at least 9 TB of data, this wasn’t a useful option.

[Screenshot: This is false advertising]
[Screenshot: This is also false advertising]

Some of you might suggest that I didn’t optimize my CrashPlan settings so that I could get the most out of it. I did. I tried everything they suggested in their online support notes. In addition to tricking out my CrashPlan install, my computer has been on for virtually all of the last four years, in an effort to help the CrashPlan app finish the initial backup, to no avail.

Another thing that bothered me about CrashPlan is that it would go into “maintenance mode” very often, and given the size of my backup set, this would take days, sometimes weeks, during which it wouldn’t back up. It would endlessly churn through its backup versions and compare them to my data, pruning out stuff, doing its own thing and eating up processor cycles with those activities instead of backing up my data.

[Screenshot: Synchronizing block information…]
[Screenshot: Compacting data… for 22.8 days…]
[Screenshot: Maintaining backup files…]

I understand why maintenance of the backups is important. What I don’t understand is why it took so long. I can’t help thinking that the cause may be the Java-based backup engine that CrashPlan uses. It’s not a Mac-native or Windows-native app; it’s a Java app wrapped in Mac and Windows versions. And most Java apps aren’t known for their speed. It’s true, Java apps can be fast, but developers often don’t bother to optimize the code, or at least that’s the claim made by some experts in online forums.

Another way to look at this situation is that CrashPlan has a “freemium” business model. In other words, their app is free to use for local (DAS or NAS) backup or offsite backup to another computer (such as a friend’s). And one thing I know is that you can’t complain about something that’s given to you freely. If it’s free, you either offer constructive criticism or you keep quiet about it; the developers are under no obligation to heed your feedback or to make changes because you say so. As a matter of fact, I used CrashPlan as a free service for local backup for a couple of years before I started paying for their cloud backup service. It was only after I started paying that I had certain expectations of performance. And in spite of those unmet expectations, I stuck with them for four years, patiently waiting for them to deliver on their promise of “no storage limits, bandwidth throttling or well-engineered excuses”… and they didn’t deliver.

Here I should also say that CrashPlan support is responsive. Even when I was using their free backup service, I could file support tickets and get answers. They always tried to resolve my issues. That’s a good thing. It’s important to point this out, because customer service is an important aspect of a business in the services industry — and online backups are a service.

About three weeks ago, I was talking with Mark Fuccio from Drobo about my issues with CrashPlan and he suggested I try Backblaze, because they truly have no throttling. So I downloaded the Backblaze app (which is a native Mac app, not a Java app), created an account and started to use their service. Lo and behold, the 15-day trial period wasn’t yet over and my backup to their servers was almost complete! I couldn’t believe it! Thank you Mark! 🙂

I optimized the Backblaze settings by allowing it to use as much of my ISP bandwidth as it needed (I have a 100 Mbps connection), and I also bumped the number of backup threads to 10, meaning the Backblaze app could run 10 simultaneous upload streams to their servers. I did have to put up with a slightly sluggish computer during the initial backup, but for the first time in many years, I was able to back up all of my critical data to the cloud. I find that truly amazing in and of itself.

[Screenshot: This is what I did to optimize my Backblaze installation]

As you can see from the image above, I got upload speeds over 100 Mbps when I optimized the backup settings. During most days of the initial upload, I actually got speeds in excess of 130 Mbps, which I think is pretty amazing given my situation: I live in Romania and the Backblaze servers are in California, so my data had to travel across a lot of internet backbones and through transatlantic cables.

The short of it is that I signed up for a paid plan with Backblaze and my initial backup completed in about 20 days. Let me state that again: I backed up about 9 TB of data to Backblaze in about 20 days, while I managed to back up only about 1 TB of data to CrashPlan in about 4 years (1,420 days). The difference is striking and speaks volumes about the ridiculous amount of throttling that CrashPlan puts in place for heavy users like me.
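
If you want a quick sanity check on those numbers, here’s a back-of-the-envelope calculation; it simply divides the figures quoted above, treating 9 TB as roughly 9,000 GB:

# Effective upload rates, using the figures from this post
echo "Backblaze: $(echo "9000 / 20" | bc) GB/day"
echo "CrashPlan: $(echo "scale=2; 905.7 / 1420" | bc) GB/day"

That comes out to roughly 450 GB/day versus less than 1 GB/day, a difference of almost three orders of magnitude.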

I also use CrashPlan for local network backup to my Drobo 5N, but I may switch to another app for this as well, for two reasons: it’s slow (it spends a lot of time doing maintenance on the backup set), and it doesn’t let me use Drobo shares mapped through the Drobo Dashboard app, which is a more stable way of mapping a Drobo’s network shares. CrashPlan refuses to see those shares and requires me to map network shares manually, which isn’t as stable a connection and leads to share disconnects and multiple mounts, which is something that screws up CrashPlan. I’m trying out Mac Backup Guru, which is a Mac-native app, is pretty fast and does allow me to back up to Drobo Dashboard-mapped shares. If this paragraph doesn’t make sense to you, it’s okay; you probably haven’t run into this issue. If you have, you know what I’m talking about.

Now, none of this stuff matters if you’re a typical user of cloud backup services. If you only have about 1 TB of data or less, any cloud backup service will likely work for you. You’ll be happy with CrashPlan and you’ll be happy with their customer service. But if you’re like me and you have a lot of data to back up, then a service like Backblaze that is truly throttle-free is exactly what you’ll need.

How to create a Fusion Drive on a mid-2011 iMac

Yes, you can enable Fusion Drive on older Macs. I’m not sure whether this method will work with Macs older than 2011, but I know for sure that it works on mid-2011 iMacs, and quite possibly on other Macs made since then. I have just completed this process on my iMac and I thought it would help you if I detailed it here.

I like Fusion Drive because it’s simple and automated, like Time Machine. Some geekier Mac users will likely prefer to install an SSD and manually separate the system and app files from the user files, which take up the most space; that gives them more control over what runs faster and what doesn’t, but it’s also a more involved process. Fusion Drive works automatically once you set it up, moving the files that are used more often onto the SSD and keeping the ones that are accessed less often on the hard drive. This results in a big performance increase without having to fiddle with bash commands too much.

The hardware

My machine is a 27″ mid-2011 iMac with a 3.4 GHz processor and 16GB of RAM. I bought it with a 1TB hard drive, which I recently considered upgrading to a 3TB hard drive but decided against, given the fan control issues with the temperature sensor and the special connector used on the factory drive.

[Screenshot: iMac basic specs]

I purchased a 128GB Vertex 4 SSD from OCZ. It’s a SATA III (6 Gbps) drive, and when I look in System Information, my iMac sees it as such and is able to communicate with it at 6 Gbps, which is really nice.

[Photo: OCZ Vertex 4 SSD, 128GB]

[Screenshot: SSD specs]

The hardware installation is somewhat involved, as you will need to not only open the iMac but also undo most of the internal connections and unseat the motherboard so you can get at the SATA III connector on its back. You will also need a special SATA cable, which is sold as part of a kit by both OWC and iFixit. The kit includes the suction cups used to remove the screen (its glass is held in place with magnets) and a screwdriver set.

[Photo: 2nd drive SSD installation kit]

You can do the installation yourself if you are so inclined, but realize that you may void the warranty on the original hard drive if something goes wrong; this is according to Apple Tech Support, with whom I checked prior to ordering the kit. Here are a couple of videos that show you how to do this:

In my case, it just so happened that my iMac needed to go in for service (the video card, SuperDrive and display went bad), and while it was in, I asked the technicians to install the SSD behind the optical drive for me. This way, my warranty stayed intact. When I got my iMac back home, all I had to do was format both the original hard drive and the SSD and proceed with enabling the Fusion Drive (make sure to back up thoroughly first). You can opt to do the same, or you can send your computer to OWC for their Turnkey Program, where you can elect to soup it up even more.

The software

Once I had backed everything up thoroughly with Time Machine, I used the instructions in this Macworld article to proceed. There are other articles that describe the same method, and the first person to realize this was doable and to blog about it was Patrick Stein, so he definitely deserves a hat tip. I’ll reproduce the steps I used here; feel free to also consult the original articles.

1. Create a Mountain Lion (10.8.2) boot disk. Use an 8GB or 16GB USB stick for this; it will allow you to reformat everything on the computer, just to clean things up. Otherwise you may end up with two recovery partitions when you’re done. I used the instructions in this Cult of Mac post to do so. The process involves re-downloading 10.8.2 from the Mac App Store (if you haven’t bought it yet, now is the time to do so) and an app called Lion Diskmaker.

2. Format both the original HD and the SSD, just to make sure they’re clean and ready to go. Use Disk Utility for this, or, if you’re more comfortable with the command line, you can use diskutil (just be aware that you can blow away active partitions with it if you’re not careful).
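
If you go the command-line route, the erase commands look something like the ones below. This is only a sketch: the disk identifiers and volume names are placeholders from my setup, so run diskutil list first and double-check yours before erasing anything.

# Erase each drive as Journaled HFS+ (JHFS+); the names here are temporary
diskutil eraseDisk JHFS+ "HDD" /dev/disk1
diskutil eraseDisk JHFS+ "SSD" /dev/disk2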

3. List the drives so you can get their correct names. In my case, they were /dev/disk1 and /dev/disk2.

diskutil list

4. Create the Fusion Drive logical volume group. When this completes, you’ll get something called a Core Storage LVG UUID. Copy that number; you’ll need it for the following step.

diskutil coreStorage create myFusionDrive /dev/disk1 /dev/disk2
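
If the UUID scrolls by before you can copy it, you can display it again with the command below; look for the logical volume group you just created (the one named myFusionDrive above).

diskutil coreStorage list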

5. Create the Fusion Drive logical volume. In the command below, jhfs+ stands for Journaled HFS+ and 100% tells diskutil to use all of the space in the volume group. I used the following command:

diskutil coreStorage createVolume paste-lvg-uuid-here jhfs+ "Macintosh HD" 100%

6. Quit Terminal and begin a fresh install of Mountain Lion onto the new volume called “Macintosh HD”.

7. Restore your apps, files and system settings from the Time Machine backup using Migration Assistant once you’ve booted up. Here’s an article that shows you how to do that. When that completes, you’re done!

The result

Was it worth it? Yes. The boot-up time went from 45-60 seconds to 15 seconds, right away. And over time, the apps and files I use most often will be moved onto the SSD, thus decreasing the amount of time it’ll take to open and save them.

At some point, I expect Apple to issue a utility, like Boot Camp, that will allow us to do this more easily and automatically. Until then, that’s how I set up Fusion Drive on my iMac, and I hope it’s been helpful to you!

The 2012 Fisker Karma

The Fisker Karma is an interesting and appealing car. It’s driven entirely by electric motors, with its battery charged by an onboard gasoline engine that acts as a generator, so you’re never out of power.

The more I learn about it, the more I like it. Things such as its sexy, uncompromising design, the fact that it’s made of renewable and recycled materials, the shapes, colors and textures of its interior, its solar roof, its low, muscular stance, and its long wheelbase with big wheels all make it very special.

It’s made by Fisker Automotive and is the vision of one man; it was designed first, then engineered. I highly encourage you to find out more about it.

I’ve posted an image gallery and a few videos below. Enjoy!

One more thing: I’ve created a new page on Facebook called “The Elegant Gentleman”, where I talk about clothing, manners and the finer things in life. Head on over and give it a like to be kept up to date with my posts there. Thanks!

A couple of suggestions for Waze


I’ve been using Waze for over a month and I love it. If you haven’t tried it yet, you should. It’s surprisingly accurate, even in a country where you wouldn’t think there’d be a lot of users, like Romania.

The traffic updates can get a little overwhelming in large urban areas like Bucharest, and sometimes it doesn’t find an address I need, but overall, it’s a wonderful app and the idea of a user-driven (and user-updated) map is awesome. Live traffic alerts and automatic calculation of the best route based on current traffic conditions are awesome features (these used to cost a pretty penny with GPS devices and weren’t very good or up to date).

Here’s a way to make Waze better: use the accelerometer in our iPhones to automatically determine whether the road is unsafe, based on braking, swerving, stopping and yes, even driving (or falling) through potholes. I love being able to report a road incident, but when I’m swerving through potholes and recently dug-up roads (like the one between Medias and Sighisoara), I don’t have the time or the multitasking brain cycles to tap on my phone and report a hole in the road. Doing this automatically and reporting it to other users would be a wonderful addition to Waze. I’d love to get an alert on my phone as I’m driving through fog or rain, when visibility isn’t great, telling me there’s a pothole ahead. And by the way, Waze, have you thought about hooking up weather info to the traffic reports?
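
Just to illustrate the gist of what I mean, here’s a toy sketch, not how Waze would actually do it: suppose the phone logged vertical acceleration to a CSV file (a made-up accel_log.csv with a timestamp column and a vertical g-force column); even a simple threshold already catches the big jolts.

# Hypothetical: flag pothole-like spikes in a CSV log of vertical acceleration (timestamp,z_g)
awk -F, '$2 > 2.5 || $2 < -1.5 { print "possible pothole at " $1 " (z = " $2 " g)" }' accel_log.csv

A real implementation would obviously need to be much smarter (filtering out speed bumps, hard braking and phone handling), but the raw data is already there in every phone.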

One thing that always annoyed me about GPS devices is the constant repetition of instructions like “take the 2nd exit” or “turn left”. The new version of Waze seems to do the same thing. I’d love an option in the settings where I could specify that I’d like to be reminded about such things a maximum of two times (not 3 or 4 times…).

A big thanks to the Waze team for the awesome work!

Hardware preview: ioSafe N2 NAS

ioSafe, the company famous for its line of rugged external drives that can withstand disasters such as floods, fires and even crushing weight, has come up with a new product: the N2 NAS (Network Attached Storage) device.

The N2 comes at the right time. The market for NAS devices is maturing and demand is growing. Western Digital has even come out with a line of hard drives, the WD Red, specifically targeted at NAS enclosures. To my knowledge, there is no other disaster-proof NAS device out there, so ioSafe’s got the lead on this.

The N2 appliance is powered by Synology® DiskStation Manager (DSM) and is aimed at the SOHO, SMB and Remote Office Branch Office (ROBO) markets.

The high-performance 2-bay N2 provides up to 8TB of storage capacity and is equipped with a 2GHz Marvell CPU and 512MB of memory. The N2 uses redundant hard drives as well as ioSafe’s patented DataCast, HydroSafe and FloSafe technologies to protect data from loss in fires up to 1550°F and submersion in fresh or salt water up to a 10-foot depth for 3 days.

Features:

  • Local and Remote File Sharing: Between virtually any device from any location online
  • Cloud Station: File syncing between multiple computers and N2 (like Dropbox)
  • iTunes Server
  • Surveillance Station: Video surveillance application
  • Media Server: Stream videos and music
  • Photo Sharing: Photo sharing with friends and family
  • Mail Server: Email server
  • VPN Server: Manage Virtual Private Network
  • Download Station: Download files (BitTorrent, HTTP, FTP) directly to the N2
  • Audio Station: Stream audio to smartphone (iOS/Android)
  • FTP Server: Remote file transfers
  • Multi-platform compatibility with Mac/PC/MS Server/Linux

Hardware:

  • Dual Redundant Disk, RAID 0/1, Up to 8TB (4TB x 2)
  • 2GHz Marvell CPU and 512MB memory
  • Gigabit Ethernet Port
  • Additional ports for USB 3, SD Memory Card
  • User replaceable drives
  • Protects Data From Fire: DataCast Technology. 1550°F, 1/2 hr per ASTM E119 with no data loss.
  • Protects Data From Flood: HydroSafe Technology. Full immersion, 10 ft. 3 days with no data loss.
  • FloSafe Vent Technology: Active air cooling during normal operation. FloSafe Vents automatically block destructive heat during fire by water vaporization – no moving parts
  • Physical theft protection (optional floor mount, padlock door security – coming Q1 2013)
  • Kensington® Lock Compatible

Support and Data Recovery Service (DRS):

  • 1 Year No-Hassle Warranty (for N2 Diskless)
  • 1 Year No-Hassle Warranty + Data Recovery Service (DRS) Standard (for loaded N2)
  • DRS included: up to $2,500/TB toward forensic recovery costs, for any reason, if required
  • DRS and Warranty are upgradeable to 5 years ($0.99/TB per month)
  • DRS Pro available: up to $5,000/TB plus coverage of an attached server ($2.99/TB per month)

Operating Environment:

  • Operating: 0-35°C (32-95°F)
  • Non-operating: 0-1550°F, 1/2 hr per ASTM E119
  • Operating Humidity: 20% – 80% (non-condensing)
  • Non-operating Humidity: 100%, Full immersion, 10 feet, 3 days, fresh or salt water

Physical:

  • Size: 5.9″W x 9.0″H x 11.5″L
  • Weight: 28 lbs

The N2 appliance is being brought to market with funding obtained through Indiegogo. I know it’s hard to believe when you look at their products, but ioSafe only has about 20 employees. Sometimes they have to be creative in the ways they fund their R&D.

The ioSafe N2 will begin shipping in January 2013 and will be available in capacities up to 8TB. Introductory pricing for the diskless version (bring your own hard drives) is $499 on Indiegogo, $100 off the retail price of $599.99.

I’ve also written about ioSafe Solo, the ioSafe Rugged Portable and the ioSafe SSD devices.

A look at lubrication inside an engine, in 1937

Nowadays, when we rarely pop open the hoods of our cars ourselves, and the only things we usually worry about are putting in gas and taking the car to the dealer or the mechanic for scheduled maintenance, we can hardly fathom what goes on inside a modern engine.

This wonderful video, made in 1937 by Chevrolet, shows how a typical engine of the era was lubricated, and it’s guaranteed to amaze you. I bet you had no idea that an engine we think of as primitive, given its age, is so complex.

At the same time, the video will give you a new appreciation for what goes on inside your car’s engine. I bet it’s even more complex now.

Are you still inclined to take your car for granted now? It is a marvel of modern engineering, isn’t it?

If you’d like to see more videos like these, subscribe to the US Auto Industry channel on YouTube.

A folding, fully electric city car

Hiriko is the name of this new foldable ultra-compact car, which is great for cramped city driving (and parking). It can turn sideways and fold upward, reducing its wheelbase and allowing it to squeeze into spots where normal cars just can’t go. And it’s also 100% electric. From the videos (posted below) I can see a solar panel on the roof, meaning it should be able to charge at least partially while you’re on the go. Other details are hard to come by on their website (I can’t find the specs), but I do know that it’ll go on sale next year for 12,700 euros.

Via MediaFax