Well, I heard from my corporate rep at Rogers today that “we are getting the new Lumia 1520 in the coming weeks”! Great news for us Nokia lovers!
So you’re running SBS 2011, and recently you notice (or an end user reports) that when trying to log in to your SBS 2011 Remote Web Workplace (RWW) you receive:
404 – File or directory not found.
The resource you are looking for might have been removed, had its name changed, or is temporarily unavailable.
You check your server, all is good. You test internally, and all is good. Absolutely no errors! What’s going on?
Well, as Microsoft pushes out updates to its Internet Explorer web browser (and as users upgrade to Windows 8 or Windows 8.1), compatibility with the Remote Web Workplace gets broken.
To fix this, you need to add your RWW site to your Internet Explorer Compatibility list:
1) Open Internet Explorer, and go to your Remote Web Workplace login page. (DO NOT LOG IN YET)
2) Press the “Alt” key, which brings up the Internet Explorer menus
3) Drop down “Tools” and then go to “Compatibility View Settings”.
4) Your internet domain should be in the “Add this website” box, just press the “Add” button, then hit Close.
5) Close out of Internet Explorer, and then go back in and try getting on remotely.
Note: If you clear your internet history, you will lose the above settings and have to reset them!
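For the curious, the “domain” IE pre-fills in step 4 is just the registrable part of your RWW address. A quick Python sketch of what that works out to (remote.example.com is a made-up RWW URL, and the naive two-label split is wrong for suffixes like .co.uk):

```python
from urllib.parse import urlparse

def compat_view_domain(rww_url):
    """Return the registrable domain that IE pre-fills in the
    'Add this website' box for a given RWW login URL.
    Naive two-label split: fine for .com, wrong for .co.uk etc."""
    host = urlparse(rww_url).hostname
    return ".".join(host.split(".")[-2:])

print(compat_view_domain("https://remote.example.com/remote"))  # example.com
```

So the whole domain gets whitelisted, not just the RWW page itself.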
And BAM! It should now work without any problems whatsoever!
Nokia Canada has this awesome Facebook contest running right now where if you describe the new Lumia 1020 in one word, you can win a Lumia 1020 prize pack which includes:
-Nokia Lumia 1020
-Nokia Camera Grip
-JBL PlayUp Portable Wireless Speaker
-3 Month Nokia Music+ Subscription
To enter, “Like” the Nokia Canada Facebook page here: https://www.facebook.com/nokiacanada and fill out the contest application form here: https://www.facebook.com/nokiacanada?sk=app_331406573613073
The good news? I already won one of the prize packs!!! Yesterday was my birthday and around 3:00PM I received a message from Nokia Canada telling me I won one of the prize packs! Woohoo!
I’m TOTALLY excited about the phone. I’ve been harassing my Rogers reps for a while wondering when I can pre-order a 1020. This is just icing on the cake baby!
Once I get my hands on it, I’ll be writing an in-depth review on the device. I already have the feeling I’m going to love it!
Most of you have heard about Shaw’s past announcements regarding their new Fiber to the Curb and Fiber to the Premise offerings; however, for some reason there are no pictures, and no documented customers who actually claim to have this service.
Well, I can officially say that one of my clients now has the Fiber to the Premise offering for businesses.
This all started with me being brought on board to provide them with Managed Services. One of the main problems we’d been having was with the current internet connection (I’m not going to mention who provides it) and how horrible the speed and reliability were. One of my first initiatives was to see if there were any alternatives. Unfortunately, due to their location (the Foothills Industrial Area), Shaw coax was not available. I sourced out numerous other providers, and we were just about to switch to a wireless internet service provider, until I decided to call Shaw one last time a week before we pulled the trigger.
To my surprise, they mentioned they had just launched their fiber offering for small businesses. The offering provided their basic coax internet service tiers and pricing, delivered over fiber instead. This is EXTREMELY attractive due to the reliability and pricing! We had the option to go all the way to the Business Internet 250 package. Higher products were available, however these were way more expensive, included SLAs, and just weren’t what we needed. My client opted for the Business Internet 100 package.
This morning the Shaw guys showed up, quickly brought the fiber in to the office, mounted the equipment, and we were up and running in no time (and as always they were EXTREMELY friendly, clean, and took care in setting everything up). I love Shaw, for those of you who don’t know…
Anyways, here’s some pics! I’ll update this post in a week or two with average speeds.
The above picture is the first device the fiber plugs in to. I don’t know its exact purpose, but I believe it provides Shaw’s coax network over the fiber line. The coax cable then went to a Shaw Home Phone cable modem for 2 phone lines. I believe the device also repeats, and provides a fiber connection to the Shaw fiber modem as pictured below.
Recently I needed to upgrade and replace my storage system which provides basic SMB dump file services, iSCSI, and NFS to my internal network and vSphere cluster. As most of you know, in the past I have traditionally created and configured my own storage systems. For the most part this has worked fantastic, especially with the NFS and iSCSI target services being provided and built in to the Linux OS (iSCSI thanks to Lio-Target).
A few reasons for the upgrade: 1) I need more storage, and 2) I need a pre-packaged product that comes with warranty. Taking care of the storage size was easy (buy more drives), however I needed to find a pre-packaged product that fits my requirements for performance, capabilities, stability, support, and of course warranty. iSCSI and NFS support was an absolute must!
Some time ago, when I first started working with Lio-Target before it was incorporated and merged into the Linux kernel, I noticed that the parent company, Rising Tide Systems, mentioned they also provided the target for numerous NAS and SAN devices available on the market, Synology being one of them. I never thought anything of this, as back then I wasn’t interested in purchasing a pre-packaged product, until my search for a new storage system.
Upon researching, I found that Synology released their 2013 line of products. These products had a focus on vSphere compatibility, performance, and redundant network connections (either through Trunking/Link aggregation, or MPIO iSCSI connections).
The device that caught my attention for my purpose was the DS1813+.
Synology DS1813+ Specifications:
- CPU Frequency : Dual Core 2.13GHz
- Floating Point
- Memory : DDR3 2GB (Expandable, up to 4GB)
- Internal HDD/SSD : 3.5″ or 2.5″ SATA(II) X 8 (Hard drive not included)
- Max Internal Capacity : 32TB (8 X 4TB HDD) (Capacity may vary by RAID types) (See All Supported HDD)
- Hot Swappable HDD
- External HDD Interface : USB 3.0 Port X 2, USB 2.0 Port X 4, eSATA Port X 2
- Size (HxWxD) : 157 X 340 X 233 mm
- Weight : 5.21kg
- LAN : Gigabit X 4
- Link Aggregation
- Wake on LAN/WAN
- System Fan : 120x120mm X2
- Easy Replacement System Fan
- Wireless Support (dongle)
- Noise Level : 24.1 dB(A)
- Power Recovery
- Power Supply : 250W
- AC Input Power Voltage : 100V to 240V AC
- Power Frequency : 50/60 Hz, Single Phase
- Power Consumption : 75.19W (Access); 34.12W (HDD Hibernation);
- Operating Temperature : 5°C to 35°C (40°F to 95°F)
- Storage Temperature : -10°C to 70°C (15°F to 155°F)
- Relative Humidity : 5% to 95% RH
- Maximum Operating Altitude : 6,500 feet
- Certification : FCC Class B, CE Class B, BSMI Class B
- Warranty : 3 Years
This puppy has 4 gigabit LAN ports and 8 SATA bays. There are tons of reviews on the internet praising Synology and their DSM operating system (based on embedded Linux), so I decided to live dangerously and went ahead and placed an order for this device, along with 8 X Seagate 3TB Barracuda drives.
Unfortunately, it’s extremely difficult to get your hands on a DS1813+ in Canada (I’m not sure why). After numerous orders placed and cancelled with numerous companies, I finally found a distributor who was able to get me one. I’ll just say the wait was totally worth it. I also purchased the 2GB RAM add-on, so I had it available when the DS1813+ arrived.
I was hoping to take a bunch of pictures and do thorough testing with the unit before throwing it into production; however, right from the get-go it was extremely easy to configure and use, so I had it running in production right away. Sorry for the lack of pics!
I did however get a chance to setup the 8 drives in RAID 5, and configured an iSCSI block based target. The performance was fantastic, no problems whatsoever. Even maxing out one gigabit connection, the resources of the unit were barely touched.
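For those wondering about the numbers behind that setup: RAID 5 sacrifices one drive’s worth of capacity to parity, and a single gigabit link caps out at 125 MB/s raw (real iSCSI payload lands a bit below that after protocol overhead). A quick back-of-the-envelope in Python:

```python
def raid5_usable_tb(disks, disk_tb):
    # RAID 5 stores parity equal to one disk's worth of capacity
    return (disks - 1) * disk_tb

# 8 x 3 TB drives in RAID 5
print(raid5_usable_tb(8, 3))  # 21 (TB usable)

# Raw ceiling of a single gigabit link, in MB/s
print(1_000_000_000 / 8 / 1_000_000)  # 125.0
```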
I’m VERY impressed with the DSM operating system. Everything is clearly spelled out, and you have very detailed control of the device and all services. Configuration of SMB shares, iSCSI targets, and NFS exports is extremely simple, yet allows you to configure advanced features.
After testing out the iSCSI performance, I decided to get the unit ready for production. I created 2 shared folders, and exported these via NFS to my ESXi hosts. It was very simple, quick, and the ESXi hosts had absolutely no problems connecting to the exports.
One thing that really blew me away about this unit is the performance. Immediately after configuring the NFS exports, mounting them, and using Storage vMotion to migrate 14 live virtual machines to the DS1813+, I noticed MASSIVE performance gains. The performance gains were so large, it put my old custom storage system to shame. And this is really interesting, considering my old storage system, while custom, is actually spec’d way higher than the Synology unit (CPU, RAM, and the SATA controller). I’m assuming the DS1813+ has numerous kernel optimizations for storage, while at the same time avoiding the overhead of a full Linux distribution. This also means it’s more stable, since you don’t have tons of applications running in the background that you don’t need.
After migrating the VMs, I noticed that the virtual machines were running way faster and were way more responsive. I’m assuming this is due to increased IOPS.
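If you want to ballpark the IOPS side of it, here’s a rough spindle-count estimate in Python. The ~80 IOPS per 7200rpm disk figure and the 70/30 read/write mix are assumptions for illustration, not measurements from my setup:

```python
def raid5_iops(spindles, per_disk_iops, read_fraction):
    """Rough random-IOPS estimate for a RAID 5 array.
    Each random write costs 4 back-end IOs (read data,
    read parity, write data, write parity)."""
    raw = spindles * per_disk_iops
    write_fraction = 1.0 - read_fraction
    return raw / (read_fraction + 4 * write_fraction)

# 8 spindles at an assumed ~80 IOPS each, 70% read workload
print(round(raid5_iops(8, 80, 0.7)))  # 337
```

The point being: 8 spindles behind one controller gives a lot more random-IO headroom than a smaller custom array, even if the CPU and RAM on paper are lower.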
Either way I’m extremely happy with the device and fully recommend it. I’ll be posting more blog articles later detailing configuration of services in detail such as iSCSI, NFS, and some other things. I’m already planning on picking up an additional DS1513+ (5 bay unit) to act as a storage server for VM backups which I perform using GhettoVCB.
Nice job, Synology!
As most of you know, I’m a huge fan of the Microsoft Surface Pro tablet. I’ve been using it since day 1 of the release and absolutely love it. This thing has become such a valuable tool in my life, if anything were to happen to it, I’d replace it in a flash.
Since I’ve had mine, I’ve had numerous clients ask about it. After demo’ing the device, most have actually gone out and pulled the trigger. They all compare it to their various old tablets, and say hands down the Surface Pro is #1.
Recently one of my clients, Larry Wellspring at Synterra Technologies Ltd. (a leading seismic consulting company located here in Calgary, Alberta and a long time client of mine), was thinking of purchasing one so he didn’t have to lug around his high performance laptop. One of the most important questions he had was the specs of the device, and whether it could handle the seismic software applications he and his business use. Since the Surface Pro is essentially a high-performing computer in the form factor of a tablet, I said chances are it would work. He went out and bought one.
Most applications worked right off the bat. However, we had a few issues with Omni 3D from Gedco. The application would install fine, however we were receiving errors when launching it:
The application was unable to start correctly (0xc0150002). Click OK to close the application.
We tried contacting Omni 3D support, however they mentioned running Omni 3D on Windows 8 was unsupported and untested, especially running it on a tablet. They mentioned they couldn’t recall Omni 3D ever running on a tablet. Well, we wanted to make history!
Trying different compatibility configurations had no effect. Ultimately, I researched the error and noticed it had something to do with the C++ runtimes. Although none of the posts had a solution to our problem, they at least pointed us in the right direction. I noticed we already had the 64-bit and 32-bit C++ 2010 runtimes installed (I believe a different application installed these), so first and foremost, I re-installed them. That had no effect. I then decided to try the C++ 2008 runtime installers. In our case, we had installed the 64-bit version of Omni 3D, so I installed the 64-bit version of the Microsoft Visual C++ 2008 Runtime components available here.
After installing this, we went to open up Omni 3D and it worked!
Keep in mind that this fix should apply not only to Surface Pro tablets, but to anyone trying to install Omni 3D on Windows 8.
Back in February, I was approached by a company that had multiple offices. They wanted my company to come in and implement a system that allowed them to share information, share files, communicate, use their line of business applications, and be easily manageable.
The first thing that always comes to mind is Microsoft Small Business Server 2011. However, what made this environment interesting is that they had two branch offices in addition to their headquarters all in different cities. One of their branch offices had 8+ users working out of it, and one only had a couple, with their main headquarters having 5+ users.
Usually when administrators think of SBS, they think of a single server (two servers with the premium add-on) solution that provides a small business of up to 75 users with a stable, enterprise-feature-packed IT infrastructure.
SBS 2011 Includes:
Windows Server 2008 R2 Standard
Exchange Server 2010
Microsoft SharePoint Foundation 2010
Microsoft SQL Server 2008 R2 Express
Windows Server Update Services
(And an additional Server 2008 R2 license with Microsoft SQL Server 2008 R2 Standard if the premium add-on is purchased)
Essentially this is all a small business typically needs, even if they have powerful line of business applications.
One misconception about Windows Small Business Server is the limitation of having a single domain controller. IT professionals often think that you cannot have any more domain controllers in an SBS environment. This actually isn’t true. SBS does allow multiple domain controllers, as long as there is a single forest and not multiple domains. You can have a backup domain controller, and you can have multiple RODCs (Read Only Domain Controllers), as long as the primary Active Directory roles stay with the SBS primary domain controller. You can have as many global catalogs as you’d like, as long as you pay for the proper licenses for all the additional servers!
This is where this came in handy. While I’ve known about this for some time, this was the first time I was attempting to put something like this into production.
The plan was to set up SBS 2011 Premium at the HQ, along with a second server at the HQ hosting their SQL, line of business applications, and Remote Desktop Services (formerly Terminal Services) applications. Their HQ would be sitting behind an Astaro Security Gateway 220 (Sophos UTM).
The SBS 2011 Premium (2 Servers) setup at the HQ office will provide:
-Active Directory services
-DHCP and DNS Services
-Printing and file services (to the HQ and all branch offices)
-”My Documents” and “Desktop” redirection for client computers/users
-SQL DB services for LoB’s
-Remote Desktop Services (Terminal Services) to push applications out in to the field
The first branch office will have a Windows Server 2008 R2 server, promoted to a Read Only Domain Controller (RODC), sitting behind an Astaro Security Gateway 110. The Astaro Security Gateways will establish a site-to-site branch VPN between the two offices and route the appropriate subnets. The first branch office has connectivity issues (they’re in the middle of nowhere), so they will have two internet connections with two separate ISPs (one line-of-sight long-range wireless backhaul, and one simple ADSL connection), which the ASG 110 will load balance and provide fault tolerance for.
The RODC at the first branch office will provide:
-Active Directory services for (cached) user logon and authentication
-Printing and file services (for both HQ and branch offices)
-DHCP and DNS services
-”My Documents” and “Desktop” redirection for client computers/users.
-WSUS replica server (replicates approvals and updates from WSUS on the SBS server at the main office).
-Exchange access (via the VPN connection)
Users at the first branch office will be accessing file shares located both on their local RODC, along with file shares located on the HQ server in Calgary. The main wireless backhaul has more than enough bandwidth to support SMB file shares over the VPN connection. After testing, it turns out the backup ADSL connection also handles this fairly well for the types of files they will be accessing.
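For anyone curious why SMB over the VPN holds up: single-stream throughput over a WAN link is usually capped by TCP window size divided by round-trip time, not by the link’s raw bandwidth. A quick sketch of that math (the 64 KB window and 20 ms RTT below are assumed illustration values, not measurements from this link):

```python
def tcp_throughput_mbps(window_bytes, rtt_ms):
    """Single-stream ceiling: window size / round-trip time,
    expressed in megabits per second."""
    return window_bytes * 8 / (rtt_ms / 1000) / 1_000_000

# Assumed: 64 KB window (classic SMB-era default), 20 ms RTT over the VPN
print(round(tcp_throughput_mbps(64 * 1024, 20), 1))  # 26.2
```

So even a modest-latency backhaul can move office files at a usable rate, which matches what we saw in testing.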
The second branch office, will have an Astaro RED device (Remote Ethernet Device). The Astaro/Sophos RED devices, act as a remote ethernet port for your Astaro Security Gateways. Once configured, it’s as if the ASG at the HQ has an ethernet cable running to the branch office. It’s similar to a VPN, however (I could be wrong) I think it uses EoIP (Ethernet over IP). The second branch doesn’t require a domain controller due to the small number of users. As far as this branch office goes, this is the last we’ll talk about it as there’s no special configuration required for these guys.
The second branch office will have the following services:
-DHCP (via the ASG 220 in Calgary)
-DNS (via the main HQ SBS server)
-File and print services (via the HQ SBS server and other branch server)
-”My Documents” and “Desktop” redirection (over the WAN via the HQ SBS server)
-Exchange access (via the Astaro RED device)
For all the servers, we chose HP hardware as always! The main SBS server, along with the RODC, were brand new HP Proliant ML350p Gen8s. The second server at the HQ (running the premium add-on) is a re-purposed HP ML110 G7. I always configure iLO on all servers (especially remote servers) so I can troubleshoot issues in an emergency if the OS is down.
Now that we’ve gone through the plan, I’ll explain how this was all implemented.
- Configure and setup a typical SBS 2011 environment. I’m going to assume you already know how to do this. You’ll need to install the OS. Run through the SBS configuration wizards, enable all the proper firewall rules, configure users, install applicable server applications, etc…
- Configure the premium add-on. Install the Remote Desktop Services role (please note that you’ll need to purchase RDS CAL’s as they aren’t included with SBS). You can skip this step if you don’t plan on using RDS or the premium server at the main site.
- Configure all the Astaro devices. Configure a Router to Router VPN connection. Create the applicable firewall rules to allow traffic. You probably know this, but make sure both networks have their own subnet and are routing the separate subnets properly.
- Install Windows Server 2008 R2 on the target RODC box. (Please note: in my case, I had to purchase an additional Server 2008 license since I was already using the premium add-on at the HQ site. If you purchase the premium add-on but aren’t using it at your main office, you can use that license at the remote site.)
- Make sure the VPN is working and the servers can communicate with each other.
- Promote the target server to a read only domain controller. You can launch the famous dcpromo. Make sure you check the “Read only domain controller” option when you promote the server.
- You now have a working environment.
- Join computers using the SBS connect wizard. (DO NOT LOG ON AS THE REMOTE USERS UNTIL YOU READ THIS ENTIRE DOCUMENT)
I did all the above steps at my office and configured the servers before deploying them at the client site.
You essentially have a working basic network. Now to get to the tricky stuff! The tricky stuff is enabling folder redirection at the branch site to their own server (instead of the SBS server), and getting them their own WSUS replica server.
Now to the fancy stuff!
1. Installing WSUS on the RODC using the Add Role feature in Windows Server: You have to remember that RODCs are exactly what they say: READ ONLY (as far as Active Directory goes)! Installing WSUS on an RODC will fail right off the bat. It will report that access is denied when trying to create certain security groups. You’ll have to manually create these two groups in Active Directory on your primary SBS server to get it to work:
Replace RODCSERVERNAME with the computer name of your RODC server. You’ll actually notice that two similar groups already exist (with a different server name) for the existing Windows SBS WSUS install; those existing groups are for the main WSUS server. After creating these groups, the install will succeed. Once it’s complete, follow the WSUS configuration wizard to configure it as a replica of your primary SBS WSUS server.
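If you’d rather script the group creation, something like this builds the `net group` commands to run from an elevated prompt on the SBS server. The group names below are placeholders only: use the exact names shown for your existing SBS WSUS install, with your RODC’s name substituted:

```python
def wsus_group_commands(rodc_name):
    """Build 'net group' commands to pre-create the WSUS security
    groups the role installer fails to create on an RODC.
    PLACEHOLDER NAMES: substitute the exact group names from your
    existing SBS WSUS install, with the RODC's name swapped in."""
    groups = [f"WSUS Administrators ({rodc_name})",
              f"WSUS Reporters ({rodc_name})"]
    return [f'net group "{g}" /add /domain' for g in groups]

for cmd in wsus_group_commands("RODCSERVERNAME"):
    print(cmd)
```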
2. One BIG thing to keep in mind is that with RODCs, you need to configure which accounts (both user and computer) are allowed to be “cached”. Cached credentials allow the RODC to authenticate computers and users in the event the primary domain controller is unreachable. If you do not configure this and the internet goes down, or the primary domain controller isn’t available, no one will be able to log in to their computers or access network resources at the branch site. When you promoted the server to an RODC, two groups were created in Active Directory: “Allowed RODC Password Replication Group” and “Denied RODC Password Replication Group”. You can’t just add users to these groups; you also need to add the computers they use, since computers have their own “computer account” in Active Directory.
To manage this, create two security groups: one for the users of the branch office, and one for the computers of the branch office. Make sure to add the applicable users and computers as members of these security groups. Now go to the “Allowed RODC Password Replication Group” created by the DC promotion, and add those two new security groups to it. This will allow remote users and remote computers to authenticate using cached credentials. PLEASE NOTE: DO NOT CACHE YOUR ADMINISTRATIVE ACCOUNT!!! Instead, create a separate administrative account for that remote office and cache that.
3. One of the sweet things about SBS is all the pre-configured Group Policy objects that enable automatic configuration of the WSUS server, folder redirection, and a bunch of other great stuff. Keep in mind that with the above config left alone, the computers in the branch office will use the folder redirection settings and WSUS settings from the main office. Remote users’ folder redirection locations (whatever you have selected; in my case, My Documents and Desktop) will be stored on the main HQ server. If you’re alright with this and not concerned about the size of the user folders, you can leave it.

What I needed to do (for simple disaster recovery purposes) was have the branch office users’ folders redirect to their own local branch server. We also need the branch computers to connect to the local branch WSUS server (we don’t want each computer pulling updates over the VPN connection, as this would use up tons of bandwidth). What’s really neat is that when users open applications via RemoteApp (over RDS) and export files to their desktop inside of RemoteApp, the files are immediately available on their computer desktop, since the RDS server uses these same GPOs.
To do this, we’ll need to duplicate and modify a couple of the default GPOs, and also create some OU (Organizational Unit) containers inside of Active Directory so we can apply the new GPOs to them.
First, under “SBSComputers” create an OU called “Branch01Comps” (or call it whatever you want). Then under “SBSUsers” create an OU called “Branch01Users”. Now keep in mind you want to have this fully configured before any users log on for the first time. All of this configuration should be done AFTER the computer is joined (using the SBS connect) to the domain and AFTER the users are configured, but BEFORE the user logs in for the first time. Move the branch office computer accounts to the new Branch office computers OU, and move the Branch office user accounts to the Branch office users OU.
Now open up the Group Policy Management Console. You want to duplicate two GPOs: Update Services Common Settings Policy (rename the duplicate to “Branch Update Services Common Settings Policy” or something), and Small Business Server Folder Redirection Policy (rename the duplicate to “Branch Folder Redirection” or something).
Link the new duplicated Update Services policy to the Branch Computers OU we just created, and link the new duplicated folder redirection policy to the Branch Users OU we just created.
Modify the duplicated server update policy to reflect the address of the new branch WSUS replica server. Computers at the branch office will now pull updates from that server.
As for folder redirection, it’s a bit tricky. You’ll need to create a share (with full share access for all users), and then set special file permissions on the folder you shared (info available at http://technet.microsoft.com/en-us/library/cc736916%28v=ws.10%29.aspx). On top of that, you’ll need a way to actually create the individual user folders under that share. I did this by going into Active Directory, opening each remote user, and setting their profile variable to the file share; when I hit Apply, this created a folder named after the user, with the applicable permissions, under that share. After this was done, I would undo the variable setting and the directory would stay. Repeat this for each remote user at that specific branch office. You’ll also need to do this each time they bring on new staff, and add all new computers and users to the appropriate OUs and security groups we created above.
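If you have a lot of users, the folder creation itself can be scripted instead of repeating the profile-variable trick for each one. A rough Python sketch of the idea (the server, share path, and usernames are all made up, and the ACL step is shown only as a comment since permissions must be granted per-user on the real file server):

```python
import os
import tempfile

def create_redirection_folders(share_root, usernames):
    """Pre-create each user's folder under the redirection share.
    Creates directories only; on the real file server you'd still
    grant each user rights on their own folder, e.g.:
      icacls "\\\\branchserver\\redir\\jsmith" /grant "DOMAIN\\jsmith:(OI)(CI)F"
    """
    for user in usernames:
        os.makedirs(os.path.join(share_root, user), exist_ok=True)

# Demo against a temp dir standing in for the branch server share
share = tempfile.mkdtemp()
create_redirection_folders(share, ["jsmith", "bdoe"])
print(sorted(os.listdir(share)))  # ['bdoe', 'jsmith']
```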
FINALLY, you can now go into the GPO you duplicated for Branch Folder Redirection. Modify the GPO to reflect the new storage path for the redirection objects you want (just a matter of changing the server name).
4. Configure Active Directory Sites and Services. You’ll need to go into Active Directory Sites and Services, configure a site for each subnet you have (your main HQ subnet, branch 1 subnet, and branch 2 subnet), and assign the applicable domain controller to each site. In my case, I created 3 sites, configured the HQ subnet and the second branch to authenticate against the main SBS PDC, and configured the first branch (with its own RODC) to authenticate against its own RODC. Essentially, this tells the computers which domain controller they should be authenticating against.
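Under the hood, the DC-locator logic is just matching the client’s IP against the subnets you defined per site. A little Python sketch of that idea (the subnets here are made up for illustration):

```python
import ipaddress

# Made-up subnets for the three sites
SITE_SUBNETS = {
    "HQ":      ipaddress.ip_network("192.168.0.0/24"),
    "Branch1": ipaddress.ip_network("192.168.1.0/24"),
    "Branch2": ipaddress.ip_network("192.168.2.0/24"),
}

def site_for(client_ip):
    """Match a client IP against the per-site subnets, the same way
    AD's DC locator picks which site's DC to authenticate against."""
    ip = ipaddress.ip_address(client_ip)
    for site, net in SITE_SUBNETS.items():
        if ip in net:
            return site
    return None  # no site match: client falls back to any DC

print(site_for("192.168.1.42"))  # Branch1
```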
And you’re done! (I don’t think I’ve forgotten anything.) A few things to remember: whenever adding new users and/or computers to the branch, ALWAYS join using the SBS wizard, add the computer to the branch OU, add the user to the branch OU, create the user’s master redirection folder using the profile variable in the AD user object, and separately add both the user and computer accounts as members of the security groups we created to cache credentials.
And remember: always, always, always test your configuration before throwing it into production. In my case, I got it running first try without any problems, but I let it run as a test environment for over a month before deploying to production!
We’ve had this environment running for months now and it’s working great. What’s even cooler is how well the Astaro Security Gateway (Sophos UTM) is handling the multiple WAN connections during failures, it’s super slick!
Well, woke up 10 minutes ago to find that my Nokia Lumia 900 notified me that new Windows Phone updates were available. Notification is for Windows Phone OS Version 7.10.8860.142 on my Canadian Rogers Branded Nokia Lumia 900.
I’ve tried using Google to find out what the update includes, however information is limited. After installing this update, another new update was also available and automatically started to install (OS Version 7.10.8862.144). Right now I’m just finishing up the second.
You’ll notice how the cancel button is usable on the first update, while it’s not on the second. I’d bet money on the fact that the second update is in fact a firmware update versus software update (or maybe both).
I’m thinking one of these updates contains bug fixes for the live tiles and other fixes, while the other may fix the Bluetooth Sharing app. Let me know if any of you notice any additional fixes/features. Happy Updating! I’ll update when I find things out and finish the updates.
UPDATE: Just finished installing both updates. Bluetooth sharing still does not work (says I have to do a update on the phone, however no more updates are available). Can’t confirm this fixes the Live Tiles “Issue” (I’ve never had the issue so I can’t comment).
I have long awaited the release of the Microsoft Surface Pro since Microsoft’s initial announcement about entering the tablet market. The first device released, the “Surface RT”, was a lightweight, thin, powerful tablet that could run Metro apps along with Microsoft Office, with around 10 hours of continuous battery life. The second device released, the “Surface Pro”, doesn’t fall neatly under the distinction of either tablet or laptop but can be used as both: a powerful portable computer that can run all your applications along with the Metro apps, be easily transported, be used anywhere, and has decent battery life (~4 hours of heavy use; I’ve gotten over 8 hours of lighter use).
Being an I.T. professional, I figured I would wait for the Surface Pro, since I believed I’d mostly be using normal Windows applications over the “Metro” style apps. I’ve been running Windows 8 on my desktop since Microsoft made it available to partners midway through 2012. During that time I once tried to configure and use the Metro apps, but using them with a non-touch interface was weird enough for me not to end up using any. I usually stay on the desktop, and when I need to launch a program I simply hit the Start button, type the first few letters of the program, hit Enter, and it launches.
First off, I want to address Windows 8 being used as a tablet interface. It’s slick! Since receiving my Surface Pro, even after installing Microsoft Outlook and other desktop applications I regularly use, I’ve found that over time I never even go into the desktop. Using the Metro interface with touch capabilities is simply brilliant. It’s very easy to use, navigate, and configure, and surprisingly enough I find that 98% of everything I do can be accomplished via Metro style applications. I don’t even go into Microsoft Outlook anymore, since I have my Exchange account configured with the Mail Metro app. Occasionally I might use Outlook, but only for advanced tasks such as meeting requests that need extra info, or dealing with numerous attachments, etc… The Windows 8 touch interface is beautiful and resembles Windows Phone to a tee.
Briefly visiting the desktop side: it’s your familiar Windows desktop, minus the Start menu, since it’s running Windows 8. On the Surface Pro you can install any Windows application, and they run great. This device has the power to run most graphics-intensive games, drawing applications, and anything else you can throw at it. It works great and I have no complaints. One thing to note is that Microsoft implemented display scaling, since the display is a true 1080p panel, and at this screen size unscaled text would be too small for those with bad eyes. So far I have not had any issues with the scaling, and applications look great.
Now, going back to the Metro style interface, there are numerous apps available. The Mail, People, Calendar, Internet Explorer, etc. apps all work great. I use these all the time and haven’t had any issues. They are perfect for working with your Exchange account, browsing the internet, talking with Facebook friends, tweeting on Twitter, browsing internet forums, etc… Again, everything works great, no problems whatsoever, and you can accomplish plenty using these.
A few apps worth mentioning: Xbox Music is fantastic. I’ve been using my Zune Pass since I purchased my first Windows Phone 7 device (a Nokia Lumia 900), creating playlists, downloading music on the fly, and I absolutely love it. I use it all the time on my desktop computer as well. When first playing with my new Surface, it was very easy to configure my Xbox Music Pass, and it actually synced all my music from my other devices to my new Surface once I enabled the feature. It’s fantastic, and now I often find myself listening to music while working, wherever I am in my house, or on weekends when I’m doing work/implementations at clients’ offices. It’s super slick!
Another app that has come in handy for me, being an IT professional, is the Remote Desktop app. Whenever rolling out updates to clients, or working on my own servers, it’s awesome being able to establish numerous RDP sessions and switch between them on the fly. It’s just that simple… It’s actually faster to use the Metro style Remote Desktop app than it is to use the native Windows application.
The list of apps I use is practically endless, so it’s pointless going into detail for each and every one of them. The native Windows 8 Metro apps are just awesome. One other app I do have to mention, a particular favorite of mine, is “Package Tracker”. I regularly sell, ship, and send items to/from clients, and it’s great being able to track all the packages in a simple-to-use interface. What’s even slicker is having Package Tracker linked to my SkyDrive account, so packages I’m tracking are synced between my Surface Pro and my Lumia 900 Windows Phone.
Now on to the actual physical characteristics of the Surface Pro. The device is thin (thinner than I’d expect for a fully working high-performance computer), and it’s built from great materials. It feels great in the hand, and the kickstand is great! There are two separate types of keyboard covers for both Surface models. I’m using the Touch cover and love it; it registers pressure applied to a pad with printed keys, and it also has a fully working track-pad. The other option is the Type cover, which has actual mechanical keys for those of you who prefer that. I haven’t seen or played with a Type cover, but the Touch cover is great for typing, works as a screen protector when mobile, and when you flip it backward the keys are disabled so you can’t accidentally trigger any of the buttons (and in the 2 weeks I’ve had mine, it’s been working flawlessly).
The Surface Pro also comes with a pen that you can use for marking up documents, taking notes (really cool to use in Microsoft OneNote), and also as a mouse when you want something more accurate than, let’s say, your finger. Microsoft really shined with this implementation: you can actually rest your hand on the screen while writing with the pen, and since the Surface Pro recognizes the pen is near/present, it disables any touch input from your hand. I tried as hard as I could to mess it up, but again, flawless every time. The pen also has a magnet mounted on the side so it attaches to the tablet when mobile. At first I thought it would fall off easily when moving around, getting in/out of the car, etc… But it’s been rock solid, and I haven’t had any accidents where it’s come off except when I actually want to remove it and use it.
As for some other random hardware notes, the Surface comes equipped with a USB 3.0 port and a Mini DisplayPort. I’ve used the DisplayPort to play movies from the Surface on my 1080p television and it was slick. The quality was amazing.
One major contribution the Surface has made is the capability to work during meetings, have information readily available, and take notes. The device is so small that when you meet with someone and use it to take notes or reference material, it’s not obtrusive when set up between you and the person you’re meeting with. Normally I have “Internet Sharing” set up on my Windows Phone; I connect to my corporate VPN and can access documents on the fly, generate invoices from QuickBooks, prepare quotes on the fly, and pretty much have access to any information when I need it. I can’t tell you how amazing it is to have all this information at your fingertips in such a nice little package.
Now to one of the biggest conclusions I’ve come to since using the Surface Pro: after realizing I use mostly Metro style apps, I could have actually gotten away with a Surface RT instead. 90% of my day-to-day work could be done on the Surface RT. I actually plan on purchasing a Surface RT soon, using the RT for day-to-day meetings, web browsing, and music, and then using my Surface Pro for when I require a full computer: implementations, work at clients’ offices, and whenever I need Windows desktop applications.
Overall I’m very impressed with this device. It’s slick, beautiful, and has increased my productivity. Perfect for everyday business or personal use. I’ve demoed the device to numerous clients (over 10) and they all love it and plan on purchasing one when stock is available.
Now for my only complaint: the Surface Pro does not have LTE capabilities. This is somewhat of an annoyance since I regularly connect to my corporate VPN for network resources. Although it’s an annoyance, you can easily work around it by either using an LTE USB data card or using “Internet Sharing” on your Windows Phone.
Recently it was time to refresh a client’s disaster recovery solution. We were getting ready to retire our 5-year-old HP MSL2024 with an LTO-4 tape drive and implement a new HP MSL2024 library with a SAS LTO-6 tape drive. We need to use tape since a full backup comes in at over 6TB.
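To put that 6TB requirement in perspective, here is a quick sketch of the cartridge-count arithmetic using the published native (uncompressed) capacities of 800GB for LTO-4 and 2.5TB for LTO-6 (the helper function name is my own, just for illustration):

```python
import math

# Native (uncompressed) capacities in GB per cartridge.
LTO4_GB = 800
LTO6_GB = 2500

def tapes_needed(backup_gb: int, tape_gb: int) -> int:
    """Minimum number of cartridges needed to hold a backup of the given size."""
    return math.ceil(backup_gb / tape_gb)

full_backup_gb = 6000  # the client's full backup exceeds 6 TB
print(tapes_needed(full_backup_gb, LTO4_GB))  # LTO-4: 8 cartridges
print(tapes_needed(full_backup_gb, LTO6_GB))  # LTO-6: 3 cartridges
```

With LTO-4 a full backup would span 8 cartridges (before compression), versus 3 with LTO-6, which is why the drive upgrade made sense for us.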
The server connected to all this equipment is an HP ProLiant DL360 G6 with an HP Smart Array P800 controller. The P800 already has an HP StorageWorks MSA60 unit attached to it with 12 drives.
Documentation for the P800 mentioned tape drive support. While I know the P800 is only capable of 3Gb/sec, this is more than enough, and chances are the hard drives will be maxed out reading anyway.
Anyway, the client approved the purchase, and we brought in the hardware and installed it. First we had to install Backup Exec 2012 (since only the 2012 SP1a HCL specifies support for LTO-6), which was messy but we did it. Then we re-configured all of our backup jobs, since the old jobs were migrated horribly.
Our first backup failed. I tried again numerous times, only to get these errors:
- Storage device "HP 07" reported an error on a request to rewind the media.
- Final error: 0xe00084f0 – The device timed out.
- Storage device "HP 07" reported an error on a request to write data to media.
- Storage device "HP 6" reported an error on a request to write data to media.
- PvlDrive::DisableAccess() – ReserveDevice failed, offline device
- ERROR = 0x0000001F (ERROR_GEN_FAILURE)
Also, every time the backup failed, the library and the tape drive would disappear from the computer’s Device Manager. Essentially the device would lose its connection. Even when logging in to the HP MSL2024 web interface, it would state the SAS port was disconnected after a backup job failed. To resolve this, you’d have to restart the library and restart the Backup Exec services. One interesting thing: when this occurred, my company’s monitoring and management software would report that a RAID failure had occurred at the customer’s site until the MSL was restarted (this was kinda cool).
I immediately called HP support. They mentioned the library was at firmware 5.80 and asked us to try updating it. The update failed since the firmware file didn’t match its checksum, but I was told this wasn’t important, as 5.90 didn’t contain any major changes. We continued to spend 6 hours on the phone disabling Insight agents, checking drivers, etc… Finally, the tech decided to replace the tape drive.
Since LTO-6 is brand new technology, even with a 4-hour response contract it took HP around 2 weeks to replace the drive, since none were in Canada. During this time, I called two other separate times. The second tech told me that at the moment no HP controllers supported the HP LTO-6 tape drives (you’re kidding me, right?), and the third said he couldn’t provide me any information, as there was nothing in the documentation specifying which controllers were compatible. All three techs mentioned that having the P800 controller in the server host both the MSA60 and the MSL2024 was probably causing the issues.
We received the new tape drive, tested, and the backups still failed. I sent the replacement drive back (it was a repaired unit, so I kept the original brand-new one). After this I tried numerous things and Googled for days. I was just about to quote the client a new controller card when I finally decided to give HP another call.
On this call, the tech escalated the issue to the engineers. Later that night I received an e-mail stating that library firmware 5.90 is required to support the LTO-6 tape drives. I was shocked, angry, etc… It turns out that library firmware 5.80 had been recalled a while back due to major issues.
Since HP Library and Tape Tools (LTT) couldn’t load the firmware, I just downloaded it manually and flashed it via the MSL2024 web interface. After this I restarted the Backup Exec services, performed an inventory, and ran a minor backup (around 130GB). Keep in mind that when the backups originally failed, the size didn’t matter; the backup would simply fail just before it completed.
The backup completed! Later that night I ran a full backup of 5TB (2 servers and 2 MSA60s) and it completed 100% successfully. Even with the MSA60 under extreme load, maxing out the drives, performance of the LTO-6 tape drive/library was not impeded in any way.
So please, if you’re having this issue, consider the following:
1. The tape library must be at firmware version 5.90 to support LTO-6 tape drives. Always, always, always make sure you have the latest firmware.
2. I have a working configuration of a P800 controlling both an HP MSA60 and an HP MSL2024 backup library, and it’s working 100%.
3. Make sure you have Backup Exec 2012 SP1a installed, as it’s required for LTO-6 compatibility (and make sure you read about the major changes before upgrading to 2012; I can’t stress this enough!!!)
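If you script your firmware audits, note that a plain string comparison on dotted versions can lie (e.g. "5.100" sorts below "5.90" as text). Here is a minimal sketch of comparing versions numerically against the 5.90 minimum from point 1 above; the function name is hypothetical, just for illustration:

```python
def meets_minimum(reported: str, minimum: str = "5.90") -> bool:
    """Compare dotted firmware version strings numerically, piece by piece."""
    def parse(version: str):
        # "5.80" -> (5, 80), so tuple comparison matches numeric ordering.
        return tuple(int(part) for part in version.split("."))
    return parse(reported) >= parse(minimum)

print(meets_minimum("5.80"))   # False: the recalled firmware, too old for LTO-6
print(meets_minimum("5.90"))   # True
```

A check like this would have flagged our library at 5.80 immediately, instead of weeks of drive swaps.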
I hope this helps some of you out there, as this consumed my life for numerous weeks.