Tag Archives: google

the Google Apps Logo

Field Notes: Google Two-Factor Authentication

There’s been a lot of discussion recently about Gmail’s two-factor authentication thanks to the publicity around the Matt Honan hack. I’ve been using it for a while and figured I’d share my thoughts and experiences. I had been using it on an account that I used only for email, so it wasn’t much of a hassle. I recently added it to a second Google account, where it’s been more of a hassle. But that account is used for more than just email, so it probably needs the protection more, and I’ve kept it enabled. In Google’s case the two factors are a password (something I know) and my phone (something I have).

Here are my notes from using Google’s Two Factor Authentication. For the record, I used my own domain with Google Apps accounts in both cases.

There’s plenty of backup available should I lose or break the phone with the authenticator app:

  • I have the Google App on my iPhone so I don’t need a cellular connection to get the code.
  • As backup I have another phone set up to get the code via SMS.
  • As another backup there are also printable one-time backup codes. The assumption is that Google can keep these codes secure.
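The reason the authenticator app works without a cellular connection is that the codes are computed locally from a shared secret and the current time (the TOTP scheme from RFC 6238). A minimal sketch in Python, using the RFC’s published test secret rather than any real account key:

```python
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32, at, digits=6, step=30):
    """Compute a time-based one-time password (RFC 6238, HMAC-SHA1)."""
    key = base64.b32decode(secret_b32, casefold=True)
    counter = int(at) // step                       # 30-second time window
    msg = struct.pack(">Q", counter)                # 8-byte big-endian counter
    digest = hmac.new(key, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                      # dynamic truncation (RFC 4226)
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# RFC 6238 test secret: ASCII "12345678901234567890", base32-encoded.
print(totp("GEZDGNBVGY3TQOJQGEZDGNBVGY3TQOJQ", at=59))  # → 287082
```

Since both sides can run this computation independently, Google only has to share the secret once, at setup time; after that, no connectivity is needed to agree on the current code.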

If an app or device doesn’t support Google’s two-factor authentication, there are Application Specific passwords:

  • “Application Specific” is more a description of the intent, rather than a technical requirement.
  • The Application Specific passwords can be used on multiple devices and applications. I’d prefer they be locked to the first app or device they’re used on.
  • If you use the password in a malicious or poorly written app, the password can be used by someone else to access your email. So common sense still applies when using the application passwords.
  • While 16 characters is a long password, it’s not as complex as it could be: all the passwords are 16 characters and appear to use a limited character set. They could be more complex, but they would still be extremely hard to crack, so this isn’t a reason not to use them.
  • The application specific passwords provide only limited access to the account (such as email access), even if compromised.
  • Application specific passwords are easy to revoke so they can be used to try out a new app and then revoked if the app isn’t used.
  • I’ve had some issues where my iPhone email (for example) decides it needs a new app password and I have to re-enter it. This is a pain as I have to go to the website and generate a new one then type it into the iPhone.
  • While I can see the last time an application password was used, I can’t tell where it was used, so if the password were stolen I wouldn’t notice unless I stopped using it myself.
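On the complexity point, password strength is easiest to compare in bits of entropy. A quick estimate, assuming (my guess, since Google doesn’t document it) that the 16 characters are drawn uniformly from lowercase letters only:

```python
import math

def password_bits(length, alphabet_size):
    """Bits of entropy for a password drawn uniformly at random
    from an alphabet of the given size."""
    return length * math.log2(alphabet_size)

# 16 lowercase letters (assumed character set of app-specific passwords):
lowercase_16 = password_bits(16, 26)
# A 10-character password over all 94 printable ASCII characters, for comparison:
mixed_10 = password_bits(10, 94)
print(round(lowercase_16, 1), round(mixed_10, 1))  # → 75.2 65.5
```

So even with the limited character set, a random 16-character password carries more entropy than a shorter password over the full keyboard, which backs up the point that the length isn’t a reason to avoid them.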

Misc Notes:

  • The initial setup is a bit of a pain. When two-factor authentication is turned on all the existing logons will break and have to be redone.
  • PCs can be made “trusted” and then for the next 30 days it won’t be necessary to enter the code when logging on.
  • If Google Sync is used (in Google Chrome) it’s necessary to use an encryption passphrase specific to Google Sync; the account password can’t be used, since an application specific password is required. Well, actually, an app specific password can be used, but it would have to be remembered and used as the app password for all Google Chrome logons, which goes against the design of the application passwords.

Anyone else using Google two-factor authentication? What’s been your experience?

Picture of trees covered with October Snow

The OSQuest Trail Log #65: October Blizzard Edition

And mother nature keeps right on attacking. Not content to wait until winter officially starts, mother nature decided to hit Connecticut with some nice, heavy snow clinging to all the picturesque October foliage, eventually bringing much of that foliage, and the limbs it was on, crashing to the ground and taking power lines along for good measure. I lost power on Saturday and just got it back Thursday, with cable/internet following on Friday. So I went through gadget withdrawal for a few days. The picture above is from Saturday afternoon, after a couple hours of snow and before the trees started coming down. Luckily the ones around me missed cars and buildings. While not everyone was so lucky, I was pretty surprised by how many downed trees managed to find open areas rather than other nearby targets. But on to the tech…

The highlight of the month for me was my first podcast as a guest on the Home Server Show podcast.

New Software

I installed CrashPlan backup on Windows Home Server, taking advantage of a discount offering unlimited online backup for $42/year instead of the normal $49. The backups have been working well, although I’d hoped to upload a big chunk in October since I had plenty of space left in my bandwidth cap this month; the power outage ruined that. It’s uploaded just over 54GB, with about 15GB in the queue that started uploading once the internet came alive. But it’ll be later in November before I add much more. I want to avoid having to throttle myself by using too much early on. I figure I can do about 100GB a month and stay under my cap, but I want to play it safe.

I moved from using Untangle as both a router and unified threat manager (UTM) to using pfSense as a router while leaving Untangle as the UTM. I’ve been happy with the results and was just beginning to dig into some of the features during the snow-shortened weekend, though so far it’s been more poking around than R & D. I also plan to do some testing to see if a caching proxy will reduce the bandwidth I use. I figure it needs to cache software and patches in order to make a real dent in my bandwidth. (The cache in Untangle didn’t actually serve much from its cache when I tested it.)

Updated Software

It seems like everything I use was upgraded. But the highlights were…

I put Lion on my desktop Mac Mini only to find my fear-imposed upgrade delay was unwarranted. Everything worked fine, with only a minor Synergy frustration due to Lion’s new behavior where mouse movement doesn’t wake it. I’ve no plans to put Lion on anything besides this Mini and my Air. The other Macs either have no reason to upgrade or the upgrade would remove features I use.

iOS is also updated to iOS 5, of course. Despite some frustrations I managed to get both my iPhone 4 and iPad 2 upgraded. I’ve definitely experienced shorter battery life, although nowhere near as bad as some complaints I read. I tend to keep things turned off and I hadn’t enabled much of iCloud. I saw the worst performance on the days I was home and had a wireless connection: despite typical usage, the battery drained far faster than when I was in the office with wireless on but no network to connect to. No scientific test, just typical usage each day, but by 5 PM at home the battery would be around 20%, while in the office it was usually above 50%. But I just read Apple has an update in beta that’s supposed to help.

Not really software, but Google Reader saw an update. I use Google Reader on my computers (with the account also being used by iPad apps). The timing was bad: the update came as I was grabbing some battery-powered 3G access during the blackout, so the last thing I wanted was change. I’ve been trying to avoid hating it just because it’s different. I didn’t use any of the discontinued features, so no complaints there. My habit is to blow through a bunch of articles and star the ones I want to read later, and that’s become a problem. Ignoring the performance problems (very uneven scrolling), the star isn’t in the same place in every post; it’s now at the end of the article rather than the first thing. And speaking of titles, while the new design has a clean look, the article titles blend right in. Sure, the star at the bottom of the post is always at the very left, but it’s hard to find, and I star based on the title. I’ll be looking for a new desktop reader and will use my iPad more to check through the feeds.

Google+

While I still maintain my status as the last human not on Facebook, I finally broke down and joined Google+ when they enabled it for Google Apps users. It wasn’t much later that I lost power so I don’t have much to say about it yet. I did find the Google+ iOS app doesn’t like Google Apps users and tells me to go get an invite when I log on, but the web interface is fine from the iPhone.

iCloud

iCloud was making news and I moved my MobileMe account to iCloud. I wasn’t a big MobileMe user, having been burned by Apple’s cloud services in the past. I think my problem with iCloud (beyond not trusting Apple to keep things running) will be that it requires users to dive into the deep end, accept the way it works, and not expect a lot of options. I gave Photo Stream a try. The problem was my camera fumbling uploaded more bad pictures than intended ones. Not really an iCloud problem, but still a problem. I have no doubt it will improve over time and I’ll be drawn into the iCloud.

Web Work

I spent some time with the plumbing of the website. It seems like I have a bunch of minor issues that I can’t seem to get to (or keep putting off). At least I was able to tackle a logrotate configuration change. I also changed my caching plugin back to WP-Supercache. I always liked the plugin but stopped using it after it broke due to an upgrade. It’s working again and I’m using it. I did make an errant mouse click and enabled compression, which didn’t work (possibly because I have compression enabled in Apache). Unfortunately it went unnoticed in my testing, and I didn’t notice a problem until my page views went way down.

My change to WP-Supercache seemed to cause another problem that went unnoticed until recently. It doesn’t seem to have been rampant, but it was frequent. I don’t quite understand the problem completely, but I don’t think it was a WP-Supercache bug. In short, my AdSense ads would often display a “Page Not Found” error in the frame for the ad. I set up the ads to display only to new visitors; view a few pages over the course of a couple months and the ads are supposed to go away. I think WP-Supercache would sometimes cache the “don’t display” version of a page, which would cause a problem when a new “display the ad” visitor arrived and the code ran to display the ad.
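My theory, sketched in Python rather than the plugin’s actual PHP (the function and cache names here are hypothetical): if the page cache is keyed on the URL alone, whichever visitor renders the page first freezes their ad/no-ad variant into the cache for everyone who follows.

```python
def render_page(is_new_visitor):
    """Hypothetical page renderer: ad markup is included only for new visitors."""
    return "page + ad" if is_new_visitor else "page"

cache = {}  # page cache keyed by URL only -- the visitor cookie isn't part of the key

def cached_render(url, is_new_visitor):
    if url not in cache:
        cache[url] = render_page(is_new_visitor)  # first visitor's variant is frozen in
    return cache[url]

# A returning visitor primes the cache, so a later new visitor gets the ad-less page:
print(cached_render("/post", is_new_visitor=False))  # → page
print(cached_render("/post", is_new_visitor=True))   # → page (ad markup missing)
```

The usual fix is to make the cache key vary on whatever the page varies on (here, the new-visitor cookie), or to skip caching for cookie-dependent fragments.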

Home Networking

I had been hoping I had pfSense and dynamic DNS set up to handle IP address changes so I could remotely access multiple home servers using my own domain. Well, when I got my internet back today Comcast gave me a new IP address, as I’d hoped. But alas, no update to DNS, so it’s time to do some more research. I’ll be tackling that this weekend. I’m hoping I just have a pfSense setting wrong. [Update: Got this working, so hopefully a write-up soon.]
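For reference, the core decision any dynamic DNS client makes is simple (the hard part, and apparently my problem, is making sure it actually fires on a WAN address change). A sketch, with placeholder addresses; I’m assuming pfSense’s built-in client works roughly this way:

```python
def needs_dns_update(current_wan_ip, last_pushed_ip):
    """Push an update to the DNS provider when the WAN address differs from
    the last address successfully pushed (or when none was ever pushed)."""
    return last_pushed_ip is None or current_wan_ip != last_pushed_ip

print(needs_dns_update("203.0.113.7", "203.0.113.7"))  # → False: record is current
print(needs_dns_update("203.0.113.8", "203.0.113.7"))  # → True: new lease, push an update
```

If the client never sees the interface event (or only polls on a long timer), the check never runs and the DNS record goes stale, which is consistent with what I saw after the outage.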

The Month Ahead

The only thing I really want to get done is getting those home servers set up with dynamic DNS and pfSense. [Update: Rather easy fix, so hopefully a write-up soon.] After that I’ll see what catches my interest. I’m also hoping this past week isn’t a sample of what’s to come this winter.

Quick Bits - Commentary placard

Google’s “Hack”

This is the non-story that just won’t go away. The big bad Google drove around stealing data by “hacking” people’s wireless networks. Articles such as those at the Huffington Post contain quotes such as “one of the most massive surveillance incidents by a private corporation that has ever occurred.”

Google collects a hell of a lot of data that concerns me more than this. In this case, what seems to have happened is that Google collected data from unsecured wireless networks as its Street View vehicles drove around. The real lesson here is Secure Your Wireless Networks! Even if a network was protected by the easily breakable WEP encryption, Google would not have gotten the data.

Of more concern than the actual data collected is that Google used some library code in a project without knowing what it did. I have concerns about how Google collects and uses data, along with a big concern about mistakes that could expose that data.

Google’s explanation rings true. It doesn’t make me feel any better about Google’s ability to avoid mistakes, but it doesn’t make me any more worried about Google’s intentions. But our politicians now have an event they can latch onto to appear to be cracking down on privacy. What concerns me more is that some governments have requested copies of the data rather than telling Google to destroy it. Luckily some governments (Ireland, for one) have it right and have had Google destroy the data.

Google’s bungling attempt to try and make email social via Buzz concerned me. Google using code they didn’t understand concerns me. Google collecting data that’s already flowing wide open in the air doesn’t concern me.

Sure, they couldn’t do it on the scale of Google, but criminals could drive around doing the same thing, and then use the data they collect.

Update: Well, it looks like Google was throwing away encrypted data and keeping the unencrypted stuff. Still, it’s more worrisome to me that this seems to be a mistake or carelessness rather than some attempt to collect info.

image of WWW on gold

Google DNS – Close But No Cigar

Among Google’s recent announcements was the introduction of Google Public DNS. I’ve been using OpenDNS and have no complaints. Well, actually, I recently found I had defaulted back to using my ISP’s DNS (Comcast), probably during a router firmware upgrade. When I switched back to OpenDNS I didn’t notice a difference over Comcast. I wouldn’t have noticed at all if I hadn’t been in the router config for another reason and happened to see it.

Comcast and OpenDNS both do typo hijacking, displaying a search page with ads rather than an error page. I went through the process of opting out of Comcast’s typo hijacking. OpenDNS also allows an opt-out for typo hijacking, which I have set. Interestingly enough, the advertising company (Google) doesn’t hijack typos for ads; it displays the error page for typos. But this lack of hijacking wasn’t a benefit for me, since my opt-outs were already in place and working fine.

To be honest, I didn’t notice any performance difference when using any of them. When I first switched from Comcast to OpenDNS long ago I did notice improved performance, but not this time. So I went looking for a way to benchmark performance and came across namebench. It’s simple to use and provides useful information.

Just download namebench and run the executable. You’ll be presented with the following screen:

namebench main screen

The “Benchmark Data Source” is a drop-down that lets you pick one of your browsers or the Alexa Top Global Domains as a data source. Picking your most-used browser provides results specific to the way you browse. Some people have complained that this could send all your browsing history to one person (the Google developer). Since the source code is public, it’s easy to confirm it doesn’t. But if you’re still concerned, picking Alexa will use generic sites.

Click “Start Benchmark” to get things going. Once the benchmarking is done (it took about 10 minutes for me) a page with the results will open in your browser. At the top will be the information you really want:

namebench results

The above result is from a run after I’d already re-configured for its previous recommendations, and OpenDNS is the second-fastest DNS server according to the benchmark. The right box displays the recommended DNS servers that should be used. In my case the first one is the internal IP of my local router, so it should be ignored. (I didn’t include it in the screenshot, but you’ll also get detailed info on the servers tested. See the previously linked namebench page for samples.)

The bottom line is Google Public DNS didn’t make the cut. So while the accuracy of the benchmark may be questioned (as with any benchmark), it’s pretty clear there’s no Google favoritism. M5Net, UltraDNS and Comcast were my recommended DNS servers. One other note: because of caching, the first run of namebench will deliver the most accurate results.
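At its core, what a tool like namebench reports is just a ranking of servers by measured lookup latency. A simplified sketch of that comparison, with hypothetical timings (the real tool issues the DNS queries itself and reports much more detail):

```python
from statistics import mean

def rank_resolvers(timings_ms):
    """Rank DNS servers by mean lookup latency (ms), fastest first --
    a simplified version of the comparison namebench reports."""
    return sorted(((name, mean(times)) for name, times in timings_ms.items()),
                  key=lambda pair: pair[1])

# Hypothetical timings from a handful of lookups against each server:
sample = {
    "Comcast":    [18.0, 22.0, 20.0],
    "OpenDNS":    [24.0, 25.0, 23.0],
    "Google DNS": [30.0, 28.0, 35.0],
}
print(rank_resolvers(sample)[0])  # → ('Comcast', 20.0)
```

This also illustrates the caching caveat: a second run re-queries names the resolver has just cached, so the timings shrink and the ranking gets less representative of cold lookups.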

So, I started off looking at Google Public DNS, but by the time I was done I was off of it. While looking into it I considered the following:

  • This gives yet more of my information to Google, which at its core is an advertising company. Their privacy policy is pretty good, and Google hasn’t monetized DNS yet. Of all the info Google has on me, my DNS info is probably the least of a concern. Let’s face it, someone is going to have this data. It’s Google’s recent cavalier comments about privacy, and all the other info they have, that are the concern.
  • Google doesn’t have to match the info to me to benefit. The additional information they collect about where people surf and how often is a treasure waiting to be mined. They don’t need to put ads on error pages to profit from DNS.
  • Google continuously hits on speeding up the web, so it’s likely they’ll keep improving performance. They have studies showing that slow response on their search results generates lower revenue.
  • They also promote security and Google certainly has the money and talent to keep DNS as secure as possible.

Like my recent foray into Google’s Picasa/Eye-Fi deal, Google Public DNS is yet another Google offering that sounded good but wasn’t quite right for me. Like Picasa, Google DNS will stay on my radar and I’ll check it out again down the road. Anyone else trying Google Public DNS?

image of a compact digital camera

Google Wants Our Photos In The Cloud

Google currently has a deal going that offers a free Eye-Fi card when you lease 200GB of storage from them for a year. When I first saw it, it seemed like a pretty good deal, and I hate to pass up a good deal. But it’s less of a deal if I don’t really need the space and won’t use the card. So that got me thinking about my options.

The space is split between Gmail and Picasa. I’m not even close to my Gmail limit, and I’m not currently a Picasa user. In theory there are also some unofficial hacks that allow the space to be used for file storage, like gDisk for the Mac. But I’m not willing to trust something Google may break at any time, so that’s not a consideration. What I’d be looking to use the space for is backing up my photos. Right now I have just under 20GB of photos, and it costs me less than $3/month to keep them backed up offsite. That’s $36/year, still shy of the $50.
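Putting rough numbers on that comparison (using my approximate $3/month figure): the total favors what I have now, while the per-gigabyte price favors the Google deal, but only if I’d actually use the space.

```python
def per_gb_year(total_usd_per_year, gb):
    """Cost per gigabyte per year."""
    return total_usd_per_year / gb

jungle_disk = per_gb_year(3.0 * 12, 20)   # ~20GB backed up at roughly $3/month
google_deal = per_gb_year(50.0, 200)      # 200GB of Gmail/Picasa storage for $50/year
print(round(jungle_disk, 2), round(google_deal, 2))  # → 1.8 0.25
```

So the Google storage is far cheaper per gigabyte, but since I’d only fill a tenth of it, the cheaper total wins for me.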

But that assumes I could easily save everything up to Picasa, and I found that wouldn’t be possible. The Picasa 3 desktop application allows automatic syncing of its albums to albums on Picasa Web Albums. But this proved to be problematic and not a better solution than plain old backup via Jungle Disk. The deal-breakers were:

  • Picasa is limited to 1,000 albums with up to 1,000 photos in each album. This sounds like a lot, but the 1,000-album limit is a deal-breaker for me. I keep my files in a directory structure, and the number of directories already exceeds 1,000. I don’t want to do any dragging and dropping to create new albums just for syncing, since that’s prone to error. Sure, I have plenty of directories with only one or two photos, but I don’t want to re-organize everything; I’m set in my ways.
  • Deleting entire albums from the Picasa desktop did not delete the album from the web. Photos within albums deleted just fine. Deleting all the pictures in a folder automatically deleted the folder, so it’s not like I could keep the folder around until the deletions synced.
  • RAW image files were synced to the web as JPEGs, so it wouldn’t be a true backup.

While a lot of people like Picasa, there was nothing that caught my attention and would compel me to use it. I’ll keep looking at it and may yet find some compelling feature, but for now I’d have a hard time justifying 200GB for Picasa. Realistically I’d be better off with a lower priced plan.

Then there’s the Eye-Fi card. If it were worth the cost, I could consider the $50 as buying the card, with the Google storage as the free product. The version offered is the Eye-Fi Home Video, which has a list price of $69, though I can’t find a street price for it online. The closest card is the Eye-Fi Share Video, which sells for $73 at Amazon. If I had to guess, I’d say the “catch” is that since the Home Video card doesn’t typically include any online component, the only online options are Picasa and YouTube; these are the only online services specifically mentioned in the offer. The Share Video allows sharing with more services. Other, more expensive cards include geotagging of photos, which would add a potentially useful feature.

I like the idea of being able to load pictures from my camera to my PC automatically, but the Eye-Fi card doesn’t offer anything else that’s compelling to me.

So while the Google/Eye-Fi offer does seem like a good deal, I’m not yet convinced it’s worth $50 to me. I’m still intrigued by Picasa and the web album component, so I’ll keep considering it.

I also decided to look at some alternatives:

  • SmugMug offers online albums along with a “SmugVault” that can be used to store any type of file (such as RAW files) but it’s a subscription service and would cost more than what I have now.
  • The old standby Flickr is $25/yr for unlimited storage. Still, it’s not a good solution for backup. There are plenty of Flickr add-ins and plug-ins, so I could probably find one to do syncing, but it still wouldn’t be a true backup.
  • I already use Windows Live Photo Gallery to organize my photos and like it. Plus there’s a free 25GB for online photo albums. But like the others, it’s lacking as a backup solution.

So, the bottom line is that Jungle Disk remains the way I back up my photos. I’m really not surprised, since it’s cheap and easy. Picasa still has my attention if I want to do some online albums, and the Eye-Fi card would offer some convenience. But I’d probably want the version that does geotagging (although I haven’t done any research to see how well it works). I may spend the $50 in a moment of weakness since it is a good deal, but for now I won’t be clicking the button to upgrade storage and order the card.