Virtual Desktops and Applications are not as important as you think

We spend a lot of time optimizing how we deliver desktops and applications. But the future will trivialize these things as the focus shifts to data and the data moves to the cloud. It is happening now in the consumer market, and the trend will infiltrate the enterprise as BYOD continues to become more prevalent.

Applications are useless without data

Take a word processing application, for example. Without data, the word processing application is really stupid. As soon as you start typing a document, you are creating data. The application is just a mechanism for creating or consuming data. And, in a lot of cases, this data is interchangeable between applications. In other words, it is the data that is important – not the application.

Applications are just a data access layer

I like programming and a staple of any good programming language is object orientation. One of my favorite design patterns is called a façade pattern. The definition of façade is “the face of” something. So a façade in this case is the face of an object. The “face” is easily switched out, but the underlying objects remain the same (loose coupling).

Let’s relate this to applications. Applications are the façade to the data. Let’s take our word processing example again. I personally use Microsoft Word (for Windows and Mac), Google Docs, and iWork Pages. All these programs access the same data and they are easily switched out, so they are the face of the data. Dropbox (and more recently GDrive) keep my data where I need it, so those things are completely different facades to the exact same data. I can get my data on a fat client, web, mobile, offline, etc.
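To make the façade idea concrete, here is a minimal sketch in Python. Every name here is illustrative, not from any real product: two interchangeable "applications" act as different faces over the same underlying document data, which is the loose coupling described above.

```python
# Minimal facade sketch: two interchangeable "applications" (facades)
# over the same underlying document data. All names are illustrative.

class Document:
    """The data itself -- independent of any application."""
    def __init__(self, text=""):
        self.text = text

class DesktopEditor:
    """One facade: a desktop-style word processor that creates data."""
    def __init__(self, doc):
        self.doc = doc
    def type_text(self, s):
        self.doc.text += s

class WebViewer:
    """Another facade: a read-only web view of the exact same data."""
    def __init__(self, doc):
        self.doc = doc
    def render(self):
        return f"<p>{self.doc.text}</p>"

doc = Document()
DesktopEditor(doc).type_text("Hello, data.")
print(WebViewer(doc).render())  # prints "<p>Hello, data.</p>"
```

Swap either facade out and the data is untouched – which is exactly why the application matters less than the data.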

Consumerization and Cloud are driving the trend

Mobile devices have exploded and BYOD is gaining popularity. These two trends are driven by user demand – not IT demand. Brian Madden put it well on ConsumerizeIT.com – “The new reality: The IT department has to compete against every random app & website out there!”

Let’s look at another example that is more consumer-related.  When I purchase an online book from Amazon, I’m paying for the data (the book).  I then have a choice of interchangeable applications that can be used as the face of the data – many versions of the Kindle, multiple iOS applications, multiple Android applications, multiple Windows applications, etc.  The application is much less important than the data.  Consumers are getting used to freedom of choice and everything “just working.”  The same will creep into the enterprise, and IT will need to cope.

So now what?

Windows applications aren’t going anywhere anytime soon, so we’ll still be delivering those for quite some time. However, I see the application becoming less important, and we’ll be thinking more about data security, offline data, application and platform choice (Windows, Mac, Linux, mobile, web, etc.), and data integrity.  Or, perhaps I’m way off here.  What do you think?

Further Reading

  • VDI, ok? What’s next? – Stephane Thirion – http://www.archy.net/2012/05/31/vdi-ok-whats-next/
  • The VDI Party is Over – Benny Tritsch – http://drtritsch.com/2012/05/the-vdi-party-is-over

It is official. I have joined Splunk!

With the new year comes a new direction for me – I have officially joined Splunk as a Solutions Architect.

I am very happy to announce that I have officially joined the team at Splunk as a Solutions Architect.  As many of you may already know, my website is a place where I share a few articles and a lot of open-source code.  Most of the code I have distributed from this website is focused on giving you insight into what is going on in your Citrix environments (Web Interface for Resource Manager, Web Interface Access Control Center, Configuration Logging, Project Raley, etc.).  Although I get ideas for these projects from the community and my own professional experiences, all of the code has been developed on my own personal time as a hobby and community contribution.

Turning a Hobby and a Passion into a Job

All that being said, joining Splunk allows me to turn a hobby and a passion into a full-time job.  On top of that, I get to work with a close friend, co-presenter, and fellow CTP/MVP Brandon Shell on a daily basis.  We will be working on building Splunk solutions around the Citrix Enterprise stack.

What Lies Ahead

This website is going to stick around and I’ll be writing more articles and open-source code.  I also plan on writing some Splunk articles here and on the official Splunk blogs.  I’m looking forward to what’s ahead!

News from Citrix Synergy

I’m blogging the Citrix Synergy keynote in real time. I’ll be adding to things that strike me as interesting to this post as announcements are made.

Unfortunately, I didn’t think about making a crazy cool AJAX out-of-band update, so you’ll have to refresh your browser.  You can also follow me on Twitter at http://twitter.com/JasonConger for snippets.

XenClient

In beta for a while, XenClient is now released to the public.  The client hypervisor moves the vision toward offline VDI.  Citrix will use a technology called Synchronizer to keep the virtual desktop on the XenClient machine synchronized with a remote copy in the data center.  Another cool feature is the ability to send a “kill pill” to the XenClient VM.  I think this is a very significant announcement.  One word of caution: XenClient supports only a limited subset of workstations.

HDX Nitro

There are a lot of announcements surrounding HDX.  HDX “Nitro” includes several sub technologies:

  • Project Mach 3 will give you 3x performance
  • Project Laser is for printing performance
  • Project Zoom is application pre-launch for instant connection
  • Project Mercury is for WAN acceleration

Check out more on HDX Nitro at http://hdx.citrix.com/nitro

Dazzle

Dazzle was shown at the last two Synergy conferences.  I like the idea of where Dazzle wants to go – user self-service – but it still lacks some functionality.  Workflow approval has been shown as a demo, but doesn’t exist yet.  I also think some of the excessive eye-candy animations need to go.  Check it out for yourself here: http://www.citrix.com/tv/#search/dazzle

My other Blogs

When I first started this site, it was my goal to write technical articles and have a platform to distribute custom written software.  I think I have kept true to that, but that makes for sparse updates.  What you may or may not know is that I do contribute to 2 other blogs out there on a more regular basis:

  • blog.Xcentric.com
    • I just started blogging here (I currently work for Xcentric as a system architect).  My goal with this blog is to focus on high-level concepts surrounding virtualization, cloud computing, application delivery, etc.

I still try to keep this site focused on extending virtual environments.  What that means is I focus on tearing down software to get to the nuts and bolts and then try to figure out ways to uniquely extend the boundaries.

Citrix turns 20

Citrix turns 20 today. In this post, I will tell you how I got introduced to Citrix several years ago.

Citrix is 20 years old today (still under the drinking age in the United States). I remember when I was first introduced to Citrix. It was back around 1999 when I worked for an integrator. Most of the work I did was Novell-related (yes, I was a CNE back in the day). Anyway, one of the projects that came down the pipeline was something totally different. My role in the project was to visit a multitude of nursing homes throughout Mississippi and Louisiana, install a Cisco router and switch (I was actually a CCNA then too), install NICs in computers, and install this thing called a Citrix client on the PCs. My final test was to make sure the Citrix client could “talk” to the Citrix server in the main location. I didn’t really realize what was going on with this Citrix thing until about the 5th install I did. It was then that I started to think this was some pretty cool stuff. By the way, this was all WinFrame 1.x (based on Windows NT 3.51).

I started to learn more about Citrix and later got certified as a CCEA. The integrator I was working for became part of the Citrix channel and I soon found myself working with some really cool SEs like Barry Flanagan. Barry still tells a story from the first time we met – I showed him where I installed a Java Citrix client on a Novell server (Novell supported Java by then). I was like – “Barry, look – you can run NWadmin directly from the Novell server console (NWadmin was a Windows app you used to administer Novell servers).” I guess he thought that was cool, because he still talks about it.

A lot has happened since then. I went to work for Citrix in Fort Lauderdale for a while. I started this Citrix-focused website. Citrix greatly diversified their product portfolio. I’ve written thousands of lines of code (for free) to extend Citrix products. I became a CTP. And, I’ve done quite a bit of technical public speaking (mostly Citrix focused).

So, I guess it is a good thing that I had to travel to all those nursing homes to set up that Citrix thing. By the way, I still remember some of those roads in Louisiana. I remember driving down “Devil’s Swap Road”. I also had to drive down a road that didn’t have a name, but I was told I would recognize it because the sugar cane was really tall that time of year. Also, as it turns out, nursing home food isn’t that bad.

The Epoch Information Center is now online

Use the Epoch Information Center to get epoch offsets from Date/Time values, or get Date/Time values from an epoch offset. This information is useful for things like manipulating the Terminal Server shadow registry.

The what is now online? The Epoch Information Center. Okay, so what is an epoch and why should I care in a Terminal Server environment?

What is an epoch?
Epoch has many definitions. The definition we’re interested in is the UNIX definition. The UNIX definition specifies that the epoch is January 1, 1970 at midnight (UTC) (a.k.a. 01-01-1970 00:00:00). This Date/Time value can be considered “Time 0”. The epoch is used as a beacon to measure time in many computer programs (such as the Windows registry). Time can be measured as the number of seconds since the epoch or before the epoch (returning a positive or negative offset). This also allows you to compare Date/Time values.
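The conversions the Epoch Information Center performs can be sketched in a few lines of standard-library Python – Date/Time to epoch offset and back, including negative offsets for dates before the epoch:

```python
# Converting between Date/Time values and UNIX epoch offsets, both
# directions, using only the Python standard library.
from datetime import datetime, timezone

EPOCH = datetime(1970, 1, 1, tzinfo=timezone.utc)  # "Time 0"

def to_epoch_offset(dt):
    """Seconds since (positive) or before (negative) the epoch."""
    return int((dt - EPOCH).total_seconds())

def from_epoch_offset(seconds):
    """UTC Date/Time for a given epoch offset."""
    return datetime.fromtimestamp(seconds, tz=timezone.utc)

print(to_epoch_offset(datetime(2009, 1, 1, tzinfo=timezone.utc)))  # 1230768000
print(from_epoch_offset(0))  # 1970-01-01 00:00:00+00:00
print(to_epoch_offset(datetime(1960, 1, 1, tzinfo=timezone.utc)))  # negative: before the epoch
```

Because offsets are plain integers, comparing two Date/Time values is just comparing two numbers – which is exactly how the shadow registry timestamp comparison works.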

Why should I care about an epoch in a Terminal Server environment?
There is one main reason you should care about the epoch offset in a Terminal Server environment. That reason is the shadow registry (HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Windows NT\CurrentVersion\Terminal Server\Install\Software). Brian Madden has written an excellent article on the shadow registry key and how it’s used. I’ve used his article and the Epoch Information Center to manipulate the shadow registry’s “LatestRegistryKey” value in a couple of scenarios.

Usage Scenarios

Scenario 1 (setting the LatestRegistryKey back in time):
The first usage scenario is addressed in Brian Madden’s article. This scenario involves setting the “LatestRegistryKey” value back in time.

Quote from Brian Madden’s article:

. . . Imagine that you have multiple Terminal Servers or Citrix servers that are all running the same application, such as Microsoft Office. If you’re like most companies, you probably installed Office on those servers several months or even years ago. Since Office was originally installed via a user session operating in install mode, the server’s shadow key was created and timestamped with the original install time. Any users who did not have the appropriate Office settings in their own personal HKCU key received them when they logged on to the server for the first time.

Now, fast forward to today. What happens if you want to add more capacity to your server farm? Most likely you’d build a few more servers, install Office on them, and load-balance them with your existing servers.

Can you see the problem here? The timestamp on the shadow key area of the new Terminal Servers will be current, and in fact much newer than each user’s individual HKCU last update timestamp. Therefore, upon logging on to one of the new servers, userinit.exe will start comparing the key-by-key timestamps of all the HKCU\Software keys to the server’s shadow area software keys. Since the server just got a fresh install of Microsoft Office, those timestamps will be much newer than any of the user’s personalized settings in their own HKCU. The result? Mass deletion of all the Microsoft Office-specific keys in the user’s HKCU\Software hive.

From the user’s standpoint, it will appear as if all of their Office settings reverted back to the defaults, all because they were randomly load-balanced to a new Terminal Server. . .

Brian suggests a couple of solutions to address this exact problem. Another solution is to go to the Epoch Information Center, grab an epoch offset earlier than that of the existing servers in the farm, and plug that offset into the “LatestRegistryKey” value.

Note: be sure to select decimal when inserting the new value.
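The "back in time" value is just an epoch offset for any Date/Time earlier than the original install on the existing servers. A quick sketch (the install date below is a made-up example, not from any real farm):

```python
# Computing a "back in time" decimal value for LatestRegistryKey:
# pick any Date/Time earlier than the original install date of the
# existing servers and convert it to an epoch offset.
from datetime import datetime, timezone

EPOCH = datetime(1970, 1, 1, tzinfo=timezone.utc)

original_install = datetime(2006, 3, 15, tzinfo=timezone.utc)  # hypothetical install date
earlier = datetime(2006, 3, 1, tzinfo=timezone.utc)            # any date before it works

offset = int((earlier - EPOCH).total_seconds())
print(offset)  # decimal value to plug into LatestRegistryKey
```

Remember that regedit defaults to hexadecimal, so switch the base to decimal before pasting the number in.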

Scenario 2 (setting the LatestRegistryKey to the current time):
Imagine that you wanted to add a registry key to each user’s HKEY_CURRENT_USER\Software registry hive. You could write a logon script to do it, or you could add the registry keys to the shadow registry, use the Epoch Information Center to get the current (or a future) epoch offset, and plug the epoch offset into the “LatestRegistryKey” (thus emulating server install mode). The advantage of doing it without a logon script is that you don’t have to come up with logic in your script to figure out whether the setting has already been applied; Windows takes care of that part for you.
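Getting the current (or slightly future) epoch offset for this scenario is a one-liner; padding a little into the future guarantees the shadow keys read as newer than every user’s HKCU timestamps:

```python
# Current epoch offset for emulating install mode "now". The 60-second
# pad is an arbitrary safety margin, not a requirement.
import time

now_offset = int(time.time())      # current epoch offset in seconds
future_offset = now_offset + 60    # optionally pad into the future
print(now_offset, future_offset)
```

As in scenario 1, enter the number into “LatestRegistryKey” as a decimal value.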

Anyway, I hope you find the Epoch Information Center useful. The reason I put it online is because I’ve needed to get an epoch offset in the past and I thought some of you might find it useful as well.