PCoIP and USB Mic misbehaving with “Follow-Me” desktop


I’ve been working on “Follow-me” desktop solutions lately, especially VMware View. Working through different workflows and use cases, I ran into a pretty vexing peripheral issue. As it stands, the PCoIP protocol has an issue with how it handles bi-directional audio through a USB microphone.

One of the new features of recent PCoIP releases is support for connecting isochronous USB devices. This class of device has a bandwidth guarantee associated with it and is essentially given privileged status. Things like microphones! You have to have a great connection or your voice input comes out all garbled and scratchy when put over the network (connecting your physical desktop to your virtual desktop). This actually works quite well, and I have to give PCoIP props for how well it performs. A good guide to the basics of getting things rolling is found here.

However, one particular use case, the “Follow-me” desktop, makes apparent a big problem right now with PCoIP and USB microphones. Normal behavior for USB redirection goes along the lines of….

  1. Disconnect device from host
  2. Reconnect device to target virtual desktop

This is the Virtual USB hub included with the View client making this happen. When you disconnect your physical computer from your View desktop, the following steps are supposed to happen:

  1. Disconnect device from target virtual desktop
  2. Reconnect device to host

This is all because only one system can “own” a USB device at a time. However, that last step is not happening for USB microphones when connecting via PCoIP. Oddly enough, the process works correctly for RDP connections, but RDP will not work correctly with some commonly used apps for a USB mic, like Dragon Naturally Speaking.

What you will observe follows this pattern….

  1. First connection from a “fresh” host PC to a Virtual desktop ends with the microphone working.
  2. The user logs off of their View desktop.
  3. Any user that then tries to log on from that desktop will be unable to connect to the microphone. You will see an error similar to “Cannot connect <device name> It may be in use by another application”.
  4. If you unplug the USB microphone and plug it back in to the physical PC, you will see the USB composite device in a disabled state in device manager.

There is a VMware KB article regarding this issue that describes in great detail what error messages you might see, along with the current official workaround:

kb.vmware.com/kb/102687

The core of the solution, if you can call it that right now, is to remove the disabled USB Composite device corresponding to your mic from Device Manager, while it is plugged in and after it has “failed” and will no longer connect to View desktops. This does reset the device so that it can successfully work with another virtual desktop again, but it is far too involved for an average user to do on a near-constant basis.

If you want to script this to take a lot of the pain out, Microsoft actually has a nifty command-line utility that lets you manipulate Device Manager. DevCon is the name, and it can be a lot of fun… but it can have… ahh… unfortunate side effects if you aren’t careful.

http://support.microsoft.com/kb/311272

Regardless, if used correctly, you can fire off DevCon to remove the specific VID/PID of the USB mic from the host PC (specifically the USB composite device). However, the end users would still need to physically unplug the mic and plug it back in before the device would be fully reset and ready to connect to another virtual session.
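
As a rough sketch of what that automation might look like, here is a small Python wrapper around DevCon: it finds the composite device by hardware ID, removes it, and asks Windows to rescan the bus. The VID/PID and the assumption that devcon.exe is on the PATH are placeholders for illustration; swap in the values for your own mic and environment.

    # Rough sketch: clean up a "failed" USB composite device with DevCon.
    # Assumes devcon.exe is on the PATH and that MIC_HWID is replaced with
    # the real VID/PID of your microphone (check Device Manager for it).
    import subprocess
    import sys

    DEVCON = "devcon.exe"                    # placeholder path to DevCon
    MIC_HWID = r"USB\VID_0000&PID_0000*"     # placeholder VID/PID for the mic

    def devcon(*args):
        """Run DevCon with the given arguments and echo whatever it prints."""
        result = subprocess.run([DEVCON, *args], capture_output=True, text=True)
        print(result.stdout)
        if result.returncode != 0:
            print(result.stderr, file=sys.stderr)
        return result.returncode

    if __name__ == "__main__":
        devcon("find", MIC_HWID)      # show what currently matches the ID
        devcon("remove", MIC_HWID)    # remove the disabled composite device
        devcon("rescan")              # tell Windows to rescan for hardware

Keep in mind that “devcon remove” acts on every device matching the hardware ID you hand it, so scope that ID tightly (hence the warning above about unfortunate side effects).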

Feelers are out to Teradici and VMware to see if there is a more elegant solution to be found. It sure would be nice to be able to treat USB microphones as equal citizens in the mobile virtual world!

Trend Deep Security Manager and Windows XP Guests


I’ve been putting together a Trend Micro DSM implementation. DSM connects to appliances on each ESX host, which communicate through vShield endpoints on the VM guests, to provide “agentless” anti-virus/malware protection (I guess we aren’t supposed to count the vShield endpoint install). It’s a pretty nifty system in that you can really lighten the load that comes from running anti-virus/malware in your VDI environments, by moving the AV scan load from the individual guests directly to the host.

One thing that is not clearly called out, and of course turns out to be crucial, is the SCSI compatibility at the endpoint. Since a lot of View deployments are still Windows XP based (usually 32-bit), don’t get caught in the trap of building up a perfect slimmed-down, disk-aligned, gorgeous image and then realizing you need to change the SCSI driver. Trend Micro DSM currently only supports the LSI Logic SCSI drivers and the VMware paravirtual driver, not BusLogic and not IDE. By default with ESX 4.1, Windows XP 32-bit gets IDE disks, and BusLogic SCSI drivers if any SCSI device is added. Changing a SCSI driver, while doable in some circumstances, is never fun and usually ends in a rebuild of the OS.

Use a SCSI disk for your XP View parent image and make sure the driver is LSI Logic Parallel. Otherwise your carefully crafted DSM environment will report all of its subsystems and appliances working perfectly, and your endpoint will show a “Filter Driver Offline” error.
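
If you want to sanity-check a parent image before pointing DSM at it, here is a minimal sketch using pyVmomi (the vSphere Python SDK) that reports which controller type each virtual disk sits behind. The vCenter host, credentials, and VM name are made-up placeholders; the logic simply flags anything that isn’t LSI Logic or paravirtual.

    # Minimal sketch (pyVmomi): report the controller type behind each disk
    # of a View parent VM. Host, credentials, and VM name are placeholders.
    import ssl
    from pyVim.connect import SmartConnect, Disconnect
    from pyVmomi import vim

    SUPPORTED = (vim.vm.device.VirtualLsiLogicController,
                 vim.vm.device.ParaVirtualSCSIController)

    def find_vm(content, name):
        """Walk the inventory and return the first VM with a matching name."""
        view = content.viewManager.CreateContainerView(
            content.rootFolder, [vim.VirtualMachine], True)
        try:
            return next(vm for vm in view.view if vm.name == name)
        finally:
            view.Destroy()

    def check_disk_controllers(vm):
        """Print each virtual disk and whether its controller is DSM-friendly."""
        devices = vm.config.hardware.device
        controllers = {d.key: d for d in devices
                       if isinstance(d, vim.vm.device.VirtualController)}
        for disk in (d for d in devices
                     if isinstance(d, vim.vm.device.VirtualDisk)):
            ctrl = controllers.get(disk.controllerKey)
            verdict = "OK" if isinstance(ctrl, SUPPORTED) else "NOT supported by DSM"
            print(f"{disk.deviceInfo.label}: {type(ctrl).__name__} -> {verdict}")

    if __name__ == "__main__":
        ctx = ssl._create_unverified_context()           # lab use only
        si = SmartConnect(host="vcenter.example.com",    # placeholder vCenter
                          user="administrator", pwd="password", sslContext=ctx)
        try:
            check_disk_controllers(find_vm(si.RetrieveContent(), "XP-View-Parent"))
        finally:
            Disconnect(si)

This won’t fix an image that is already on BusLogic or IDE, but it will at least tell you before you burn a day wondering why the filter driver is offline.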

A question of continuity

So stick with me for a moment. I’m an applications-oriented guy, unlike a lot of my storage-oriented brethren here at Varrow… but I want to make the argument that the question of how we preserve our information is one of the oldest questions on the planet.

Almost 30,000 years ago, our ancestors were trying to preserve knowledge of their way of life. They delved into the darkness under the mountains, to the most secure place they could think of. They left later generations some amazing snapshots of their life.

And so it went, the great experiment. We constantly devised new methods to store our race’s DNA, our history and culture. The constant struggle to retain information. Egyptian papyrus scrolls reveal to us now, 4,000 years later, a complex body of mathematical and scientific knowledge, with great chunks smudged or torn out.

To think, in the Dark Ages there were monks whose sole task on this earth was to lovingly copy the contents of one book into another. This was considered holy work, to preserve knowledge in a dark time.

Somehow we survived, and now that’s all old school. Everything is digital now. We are all set, right? Well, maybe not. Optical and magnetic media have a pretty short shelf life. CD, DVD, and DLT tape last 5-30 years if you are very lucky. My sons’ copy of Mario racing lasted one year. But the times they are a-changing.

Which brings me to the present, where we as a race are finally solving one of the most ancient problems: how do we preserve our information? Storage companies are finally bridging the gap to allow for geographically dispersed storage movement without interruption of service. Just as the internet provided a common bus for communication between storage and processing pools, technologies such as EMC’s vPlex are enabling this kind of federation at the storage level. All without interruption.

Eventually, as this technology embraces all storage platforms, we are going to end up with storage “networks” that cover the entire country. You will be able to move your entire data center in real time from one state to another.

This is just the beginning of what is possible. As bandwidth becomes ever more ubiquitous, your storage array becomes just as flexible as your blade servers in moving around live resources.  Information is constantly refreshed. Even at the consumer level, the availability of the “public” cloud allows for the preservation of your family album.

Your critical data, the data your business lives and dies by, is guaranteed a steady cycle of constant refresh, without having to deal with all the hassle of managing the risk of a potential massive service interruption. It becomes a non-issue. Constant data refresh just HAPPENS. For managers of large implementations, even with ITIL guidelines, once the new environment is tested and certified, the actual migration is yawn-inducing. Not a terrifying journey into the depths of what caffeine can do to a person at 4 in the morning.

This is the kind of future we are building towards. No more “my family pictures were lost in the fire”. No more “a burst water main destroyed our DC”. We are at the point of creating a continually renewing global record of mankind. One that can withstand the usual random chaos we have all seen so closely. It isn’t a record etched in titanium and buried in the desert, or a painting buried 1,000 feet below the earth in a cave in the south of France. What we are working towards, and are seeing the first sprouts of success with now in EMC vPlex, is a vision of truly federated storage, eventually across vendors. Where entire virtual infrastructures can move from one storage platform to another in real time.

And as our scientific research delves into how we can store data at ever-shrinking scales, and as we sit on the verge of a sea change in how active data is stored, the problem our ancestors faced of “how do I preserve this wicked cool sabertooth drawing” is reaching its ultimate conclusion. We are achieving a platform on which any piece of data can be maintained in an active (accessible) and renewable state.

Pretty cool time to be in IT!!

An End to a Cycle

Since the days of the above, the way we interact with computers has moved kind of like the tide. Years after this 30-ton monster got done revolutionizing the science of a postwar US, we moved into a world dominated by big iron. At work was a giant IBM mainframe, and at home was… well… nothing. Come on! We’re talking the ’50s-’70s. Sweet headphones were your best bet.

Our computing power was centralized. This was the first age of what we now call “cloud” computing. Every bit of data in one massive repository. A pure era. Perfection. Even the programming code was constructed by pretty much straight-up scientists. This was the Garden of Eden of computing for the human race.

And then… this…

and the great exodus began. Eden was shattered. The march to the edge. Because you know what? That’s freaking cool.

This commenced, and then proceeded nearly uninterrupted, from that point until about five years ago. Computing resources steadily flowed more and more to the endpoint rather than the center.

We may suppose someone asking:

“Why? Why would you move data so far apart? We were in a pure state with all the info nice and snug together. Why would we want to leave the nest?”

To which everyone on earth would respond “Do you see how sexy those pictures above are?!?!”

A massive explosion in what you could do with a computer occurred. Suddenly the accounting department could get a month’s worth of work done in a week, and at home, kids were driving parents crazy with requests for Ataris and Nintendos and PlayStations. Some people were writing little Logo scripts for a tiny turtle, but we’re not talking about that!

For the decades that followed, more and more data was processed on the ever-shrinking box under your desk. The sad carcasses of the central servers stood dumbly in the center, relegated to shuffling around emails and accounting database query responses. Sometimes, when one got really lucky, it got to run a batch job!! Forgotten hulks, in heavily air-conditioned rooms (if they were lucky).

And then… almost completely unobserved by humanity, in 1972 some of the brightest humans on the planet opted for <python>something completely different.</python>

ARPANET!!!

(The little internet that could)

And with the creation of this little wonder and some clever conversation protocols, the flow to the edge reversed. Suddenly the horse blinders were off, and the sound of screeching modem connection strings was ringing.

This slowly, inexorably, brings us back to today. The giant systems back in the data center are suddenly waking up and running applications again. As our science progresses, ever smaller devices are capable of ever increasing feats. All aided by nearly instantaneous communication completely leveling the playing field. We are evolving to the stage where it doesn’t matter where you do your computing. The information simply flows where it needs to be.

So I put before you that we stand on the precipice of a new age. Suddenly the way you use computers at home is no different from the way you use them at work. Just as you pop onto an app store to grab a useful little widget for your phone, or a hilarious video on YouTube, you can pop into your virtual desktop at work from anywhere in the world. In a moment you can grab whatever app you need in a seamless environment, always at your bidding, anywhere you are.

Suddenly the walls are falling away, and in their wake lies the opportunity to reinvent and reimagine how things should work. The earthquake under our feet is subtle, but it’s there.

We are coming to a state where the cycle of back and forth between central computing and decentralized computing truly is forming a “cloud” of computing resources that we can draw from, whether at home or at work.

It is one strange case of marketing-speak actually matching reality. We truly are developing in a way that resembles a cloud. Simply a secure and normally controlled cloud.

This blog is meant to be part exposition of the interesting and the odd that I (and hopefully others) come across as we all begin really adapting to the new environment.

Full disclosure mode engage! I’m a systems engineer type. I’ve worked with what you could call “converged” systems for years now. I see how things are changing in the workplace at nearly the same pace that they change in our homes. Our tools are evolving at an ever more astonishing pace.

Suddenly we can allow a single person to maintain and grow a massive number of servers. With the right software it can be done with practically no effort. At the same time, in our homes, if we want to hear the latest hot track, or suddenly want to listen again to an old classic, it is a mere one minute and 99 cents away at any time. On the sweet living room stereo, if you want it.

We now can say, “Hey, highly trained college grad! Instead of the boring old company PC, we’ll just give you X amount of money towards whatever hot item you want!” And in the background, a normally stressed-out security admin is peacefully snoring in the breakroom, as all the applications and data are delivered in a secured, encrypted pocket from which not even light could escape.

There are, right now, certain vendors that are rapidly blurring this line, giving companies and consumers a flexibility and reliability that simply never existed before. There are companies whose sole purpose and passion is to help businesses be the best at harnessing this change, and the next generation. I happen to work for one of these companies, but I have no doubt they are everywhere throughout the world.

It’s a truly interesting time that will present many opportunities to those who can see them. Especially in an environment where you need to bring in the best and the brightest, those companies that can make business life as easy as home life will thrive.

The cycle is over. The server and the client fought it to a draw, and everybody won. There are no more rules about where your computing happens; there is only opportunity in deciding how it happens. It’s just like the Oklahoma land rush. It’s a wide-open field to figure out how we work in the future.

Game on