Tuesday, Dec 01, 2009

Rocky Mountain Tech Trifecta v2.0 Registration Open

Registration is now open for this year’s Rocky Mountain Tech Trifecta.  This year’s all-day event will be Feb 27th in Denver, Colorado at Metro State College Denver.  If you’re not in Colorado – don’t stop reading yet – last year we had people attend from 9 different states.  Remember, in Colorado it is possible to do this on Saturday and snowboard on Sunday.  For those of you that don’t ski/snowboard, I’ve been told there’s good shopping nearby as well.

The event will have three tracks: Developers, SQL, and IT Pros.

Don’t wait to register – at last year’s event we got very close to having to cut off registration, and that was before anyone believed a good event could happen in Denver!  If you think you can make it, register soon to save your spot.  You can register on the event site here. It’s a free event, so you can’t complain about cost even if you have to buy a cheap flight to Colorado!  More details can be found at http://www.rmtechtrifecta.com, which will keep being updated as we get closer to the event.

Last year’s event had over 400 people attend, covering over 40 topics.  It also produced some other interesting facts (stolen from Julie’s post):

936 slices of pizza eaten

2,550 minutes of technical knowledge shared

4,000 Curious George fruit snack pieces consumed

5,670 oz of soda that was consumed

Oh and my favorite “18 guys named Dave” – maybe this year we can get over 20!

So why are you still here – go register.

Sunday, Nov 22, 2009

RIA Services: How to Load Only If Authenticated

One of the things I ran into in converting to the PDC release of WCF RIA Services is error handling in the Domain Data Source.  By default the PDC release handles errors a little differently than the prior release, in a good way.  Previously, if you encountered an error and didn’t explicitly provide error handling, it would silently sweep the error under the carpet.  This worked great for demo applications because you typically didn’t have to worry about it.  For real applications, though, this led to not knowing what was wrong and left the user wondering why nothing was showing.

Now in the PDC release the error will be raised unless you handle it.  This means that if there are times when you expect to encounter an error you need to make accommodations; otherwise an unhandled exception will be raised.  A good example is a Domain Service you have marked as requiring authentication by adding the RequiresAuthentication attribute.  This attribute ensures that nobody who isn’t authenticated can access your Domain Service.  The result, though, is that if your application attempts to call the Domain Service before authentication is completed, you will get an error.

Where I hit this was an application with a couple of Domain Data Source controls set to auto load on the main page.  There are a few different ways to handle this, but I wanted to share one that I was exploring to make this type of situation easy to handle.  The behavior I want is for the Domain Data Source to auto load, but only if the user is authenticated.  I could do this by turning off auto load and manually calling the load once the user is authenticated.  Another option is to make the Domain Data Source a little smarter by adding an OnlyIfAuthenticated property that is checked during the loading event.

I’m able to do this because I have a class, SLRIADomainDataSource, that inherits from DomainDataSource and gives me a place to put custom logic that I want on every use of the domain data source in my application.  You could also accomplish the same thing using behaviors.  Both approaches are viable; they just differ in how you enable the feature.  With one you add a property; with the other you add the markup necessary to attach the behavior to the control.  Behaviors require you to know the behavior class exists, whereas with the inherited class the property just shows up in the property designer, making it easy to enable.  Behaviors are nice because they allow you to add logic to a control that you don’t own or that is sealed.
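For illustration, the behavior alternative mentioned above could be sketched roughly like this.  This is just a sketch, not code from the article – the class name OnlyIfAuthenticatedBehavior is hypothetical, and it assumes the Behavior&lt;T&gt; base class from the System.Windows.Interactivity assembly that ships with Blend:

```csharp
// Hypothetical sketch of a Blend behavior that cancels the load when
// the user is not yet authenticated; attach it to a DomainDataSource.
public class OnlyIfAuthenticatedBehavior : Behavior<DomainDataSource>
{
    protected override void OnAttached()
    {
        base.OnAttached();
        AssociatedObject.LoadingData += OnLoadingData;
    }

    protected override void OnDetaching()
    {
        AssociatedObject.LoadingData -= OnLoadingData;
        base.OnDetaching();
    }

    private void OnLoadingData(object sender, LoadingDataEventArgs e)
    {
        // Cancel the query if authentication hasn't completed yet.
        if (!WebContextBase.Current.Authentication.User.Identity.IsAuthenticated)
            e.Cancel = true;
    }
}
```

The trade-off is exactly as described: the behavior works on any DomainDataSource without subclassing, but the page developer has to know to attach it in markup.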

So the following is the class we’re working with – notice it inherits from DomainDataSource.  We have also already hooked into the LoadingData event to get control when the control is about to load data.  LoadingData is raised prior to the query happening, so it’s the ideal place to cancel the query before it is performed.  This is also where you would hook in to handle other pre-query errors related to query composition.

public class SLRIADomainDataSource : DomainDataSource
{
        // When true, the query is only performed for authenticated users.
        public bool OnlyIfAuthenticated { get; set; }

        public SLRIADomainDataSource()
        {
            this.LoadingData += new EventHandler<LoadingDataEventArgs>
                             (SLRIADomainDataSource_LoadingData);
        }
}

Next, we are going to implement the handler for the LoadingData event.  In this handler we will check OnlyIfAuthenticated to see if we have configured the Domain Data Source to only load if the user is authenticated.  From the WebContextBase class we can access the status of the user to see if they are authenticated yet.  If they aren’t authenticated we simply mark the query canceled in the event arguments.

void SLRIADomainDataSource_LoadingData(object sender, LoadingDataEventArgs e)
{
    if (OnlyIfAuthenticated)
    {
        if (!WebContextBase.Current.Authentication.User.Identity.IsAuthenticated)
        {
            e.Cancel = true;
            return;
        }
    }
}

In case you’re wondering, WebContextBase used to be RiaContextBase in the prior CTP release of RIA Services; it has just been renamed.

So with this additional logic the page developer doesn’t have to worry about special code each time to decide whether the query should be started.  You also won’t get unexpected errors if the domain data source tries to call the service when the user is not authenticated.  The error handling in the new release of RIA Services is a welcome change that prevents silent errors.  It is also a little more consistent and intuitive in how errors are presented, so you don’t have to hook into as many places.

Wednesday, Nov 18, 2009

New Silverlight 4 Book Content

Today we are happy to announce the availability of some Silverlight 4 book content. For existing Silverlight developers looking to get up to speed quickly with the new features, we are releasing the Silverlight 4 Overview. This is a little over 50 pages of content covering the new Silverlight 4 features.  For the rest of this week, using code SL4DaveBlog at checkout, you can get the new Silverlight 4 content for only $5, almost half off the normal price.  More details are on the book site: http://www.silverlightjumpstart.com

For developers who are new to Silverlight but comfortable with .NET, we are releasing a preview of Silverlight 4 Jumpstart. The Silverlight 4 Jumpstart content builds on the success of the Silverlight 3 Jumpstart book to offer content focused on the business .NET developer.

Both of these offerings are available today and will continue to evolve with the Silverlight 4 release. These are delivered in an electronic format (PDF) and will continue to be updated with more current releases of Silverlight 4.

 

The following is an excerpt from the Silverlight 4 Overview chapter that is available as part of Silverlight 4 Jumpstart Preview book or as a standalone chapter from SilverlightJumpstart.com. The full overview chapter covers all the major new features of Silverlight 4 to help you get up to speed quickly.

Microsoft has fast tracked Silverlight to be a strong competitor in the global RIA space and squarely positioned itself against competitors like Adobe, Google and Yahoo for production of the finest RIA toolset. The initial battleground was video, but we are now seeing Silverlight has strong potential for building business applications as well. We have tried through the previous chapters to streamline your learning of the current version of Silverlight by focusing on the key areas a business developer needs to know. Now it’s time to talk about the future and what the road ahead looks like for Silverlight.

Silverlight 2 was released in October 2008, and only about nine months later Silverlight 3 hit the street in July 2009. Then, just four months after the release of Silverlight 3, Microsoft released the Silverlight 4 Beta at its Professional Developers Conference in November 2009. Each of these releases builds on the prior one to add new features while keeping compatibility, supporting this fast pace of innovation.

If I had to pick a single theme for the main items that are part of Silverlight 4, I would have to choose “You asked, Microsoft built it”. I say that because many of the items – printing or web camera/microphone support, for example – were among the highest user-prioritized features. You can check that out for yourself at Silverlight.UserVoice.com, and while you’re there, add or vote on a couple of your requests.

Silverlight 4 is also a major deal because it’s the first release of Silverlight to support the .NET 4 CLR (Common Language Runtime). This gives developers access to the latest runtime features added in CLR 4, including things like dynamic object support.

In addition to the core Silverlight 4 Beta, Microsoft also released corresponding updates to the other tools and products used with Silverlight. The tools for working with Silverlight from within Visual Studio were updated to support the Silverlight 4 Beta. This includes increased designer support to make it easier to develop Silverlight applications without having to leave Visual Studio for a separate tool. A new version of the Silverlight Toolkit was also released that goes along with the Silverlight 4 update. An update was also released for .NET RIA Services which has now been renamed as WCF RIA Services to reflect the fact that it now rides on top of WCF. This is an evolution of the prior .NET RIA Services releases and positions it to leverage WCF as a foundation to build on going forward. In addition to the WCF change a number of additional features such as improved inheritance support were added to WCF RIA Services in this release. Finally, a preview release of Blend for .NET 4 was released to allow it to work with Silverlight 4.
In the rest of this chapter we are going to preview some of these features that you will see in the Silverlight 4 Beta release.

Web Camera / Microphone Support

Silverlight 4 now allows developers access to the raw audio and video streams on the local machine from applications running both in and out of the browser. Using these capabilities, developers can write applications such as capture and collaboration tools using audio and video. This is built into the core runtime, and no other special downloads are required on each machine. When the audio or video is accessed for the first time by the application, the user will be prompted to approve the request. This ensures that audio and video are never accessed without the user’s knowledge, preventing applications that capture silently in the background. The following is an example of the prompt the user sees when the application requests access to the devices.

[Image: the prompt requesting access to the video and audio devices]

You will notice in the above image the site name is displayed. This is another safeguard to ensure the user knows which site is requesting access to the devices. Access is granted to just this application and only for this session of the application. Currently there is no option to persist the user’s approval to avoid re-prompting each time the application is run. Additionally, it’s all or nothing; you don’t get to choose video or microphone. It’s a combined approval.

Users with multiple devices can select the devices they want to be the default devices using the properties on the Silverlight plug-in. This can be selected by right-clicking on a Silverlight application and going to the Webcam/Mic tab.

The following is an example of what you will see on that tab.

[Image: choosing the default webcam and microphone on the Webcam/Mic tab]

Developers can get access to the chosen devices using the CaptureDeviceConfiguration class. Using this class you can call the GetDefaultAudioCaptureDevice or GetDefaultVideoCaptureDevice methods to retrieve the user’s selected defaults. The class also has GetAvailableAudioCaptureDevices and GetAvailableVideoCaptureDevices methods that allow you to enumerate the available devices if you want more control over choosing a device beyond the default.
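As a small sketch of the enumeration approach (assuming the CaptureDeviceConfiguration methods described above; the fallback logic is just one way you might choose a device):

```csharp
// Sketch: use the default video device, falling back to the first
// available device if no default has been selected.
VideoCaptureDevice device =
    CaptureDeviceConfiguration.GetDefaultVideoCaptureDevice();

if (device == null)
{
    foreach (VideoCaptureDevice candidate in
             CaptureDeviceConfiguration.GetAvailableVideoCaptureDevices())
    {
        device = candidate;   // take the first device found
        break;
    }
}
```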

Prior to using the devices you must request access by calling the RequestDeviceAccess() method on the CaptureDeviceConfiguration class. When this method is called it is responsible for showing the user approval dialog we saw earlier. This method must be called from a user-initiated event handler, such as the handler for a button click event. If you call it at other times it will either do nothing or produce an error. Using the AllowedDeviceAccess property you can query whether access has already been granted to the device.

The quickest way to get started with video is to attach the capture from the device to a VideoBrush and then use the brush to paint the background of a Border. The following XAML sets up a button to trigger the capture and a border that we will paint with the video brush.

<StackPanel>
    <Button x:Name="btnStartvideo" Click="btnStartvideo_Click"
            Content="Start Video"></Button>
    <Border x:Name="borderVideo" Height="200" Width="200"></Border>
</StackPanel>

Next, the following private TurnOnVideo method is called from the handler for the button’s click event. This satisfies the requirement that the call be user initiated.

private void TurnOnVideo()
{
    VideoCaptureDevice videoCap =
        CaptureDeviceConfiguration.GetDefaultVideoCaptureDevice();
    AudioCaptureDevice audioCap =
        CaptureDeviceConfiguration.GetDefaultAudioCaptureDevice();

    CaptureSource capsource = new CaptureSource();
    capsource.AudioCaptureDevice = audioCap;
    capsource.VideoCaptureDevice = videoCap;

    if (CaptureDeviceConfiguration.AllowedDeviceAccess
        || CaptureDeviceConfiguration.RequestDeviceAccess())
    {
        capsource.Start();

        VideoBrush vidBrush = new VideoBrush();
        vidBrush.SetSource(capsource);
        borderVideo.Background = vidBrush;
    }
}

As you can see in the code above, default audio and video devices are retrieved and assigned to a CaptureSource. Access to the devices is then checked and requested if not already approved.

If access is granted the Start() method on the CaptureSource is invoked to begin capturing audio and video. Finally, the VideoBrush source is set to the CaptureSource instance and the background on the border is set to the VideoBrush.

Over time we will probably see some very interesting applications of the audio and video support. One example we put together used it with Microsoft Dynamics CRM. This example simulated a membership application that associated members with pictures and stored the pictures in a database. Think of a place similar to Costco, Sam’s Club, or your local gym that snaps your photo for their records.

In the following image you can see how a tab has been added to the Contact form using the CRM customization capabilities.

 

[Image: CRM Contact form customized with the added capture tab]

A Silverlight 4 application is then hosted inside that tab that will provide the user experience for capturing the images. When the Start Camera button is clicked the user will be prompted to approve the access and the video feed will begin as you can see below.

[Image: live video feed shown after clicking Start Camera]

The video feed keeps showing the live image from the web cam until stopped. The Capture button in the application above allows the user to capture one of the image frames from the capture source. The AsyncCaptureImage(..) method on the CaptureSource class lets you request that a frame be captured and your callback invoked. The callback is then passed a WriteableBitmap representing the captured frame.

[Image: captured image frame displayed in the application]

This image can then be saved back to the Dynamics CRM server and associated with the record being viewed.
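The frame-capture step can be sketched roughly as follows. This is a sketch based on the beta API described above (the exact callback signature may differ, and imgPreview is a hypothetical Image element, not part of the CRM example):

```csharp
// Request a single frame from the running CaptureSource; the callback
// receives a WriteableBitmap of the captured frame.
capsource.AsyncCaptureImage(delegate(WriteableBitmap bitmap)
{
    // Show the frame; from here it could be encoded and saved back
    // to the server, as in the CRM example above.
    imgPreview.Source = bitmap;
});
```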

In the above example we looked at how you could use the video capabilities to capture a static image. More advanced applications are also possible for things like collaboration by showing the real time audio and video feed of multiple users.

You have been reading about one of the many new and exciting features of Silverlight 4 that are covered in the complete overview chapter. Visit SilverlightJumpstart.com today to access the full chapter.

Sunday, Oct 11, 2009

Cloud Computing Meets the Tax Man

Friday I got a nice surprise e-mail from the Azure team – “Action Required: Migrating Applications from the USA – Northwest” – which continued to let me know that the northwest region will no longer be supported.  It suggested that I delete my project and re-create it using the southwest region.  Ideally I would prefer they just handle it, but since Azure is still in CTP mode I will cut them a little slack.

Looking into this a little more, I found some clearer details on the Azure team blog here.  It explains: “Due to a change in local tax laws, we’ve decided to migrate Windows Azure applications out of our northwest data center.”

You might notice the blog post is dated August – I’m not sure if I saw it and thought it didn’t apply, or never read it.  Either way, from my perspective, most of the time I don’t care whether my data is in the south, north, east, or west – I do tend to care which country it’s in, but beyond that I just want it handled behind the scenes.  There are, however, good reasons to allow some control over the specific location when application and/or business needs dictate.  It would be great to see Azure provide tools to let you migrate data as needed between regions.

As more applications move their data to the cloud, we can only expect more skirmishes with taxing authorities as they wrestle to balance getting their fair share of taxes against pushing businesses away.  This article on the Data Center Knowledge site elaborates on the specifics of Microsoft’s problem with Azure here.  I suspect we also haven’t seen the end of location mattering for other legal jurisdictions over data, but that’s a topic for another day.

Part of the appeal of cloud computing is its ability to abstract you from some of the low-level issues of hosting data and computing power.  Power, cooling, hardware expenses, taxes, and more are all part of the package that makes this type of arrangement interesting.  It stands to reason that companies like Microsoft will make location decisions based on where there is a strategic advantage for a key component of their expenses in providing the service.  In fact, I think that will be required for any of these companies to be competitive.  I also think that making such moves non-impactful to the end user is a core requirement of offering the service as well.

Wednesday, Sep 23, 2009

Lessons learned from my iPhone experiment

About once a year I need a new phone – not because the old one dies, but because I get bored and start looking for something new.  This year was no different: as I approached one year since replacing my Treo 750 with a Treo Pro, I was in search of a new device.  For months I had ignored everyone from friends to my Mom getting an iPhone.  Finally, a few weeks ago, about 15 minutes before the AT&T store closed, I decided to grab one and give it a try.

I lasted almost 3 weeks before I terminated the experiment due to a few different issues.  But before I did, I learned some interesting things.

I didn’t mind paying a small amount for useful apps – The under-$3 applications make it easy not to ponder the decision to try one too much. The integrated application store is clearly what makes this so easy. I wonder: if micro transactions were easier all across the web, would people be willing to part with their money more often?

Free applications are a powerful business tool – Just before walking into a restaurant the other night I made a reservation with Open Table’s application, and just as we walked in it popped up on the manager’s screen. Newspapers, Chipotle, Fandango – they’re all there. Just as the web made having a web page a requirement for a business, the iPhone is having a similar effect in the mobile world. Today it’s the iPhone, but I suspect that’s just the catalyst helping people get comfortable with that type of interaction.

Not all phones have signal equality – It’s not new that different phones have different reception patterns, but I have never seen a phone in this price range be so inferior. Last weekend we had a chance to drive from Colorado to Utah. On multiple occasions my iPhone was the only AT&T phone in the car without signal. I travel a fair amount, and I have always enjoyed the fact that I arrive and my phone just works. I’ve had my other phones work on a boat, in remote areas, and in other countries, and never have I noticed such a difference from other AT&T phones as I have with the iPhone.

AT&T has a mini monopoly – I can only imagine the executive meeting where they decided to stick it to customers who wanted to jump on the iPhone parade. I knew about the data plan requirements; I was used to that. I also knew that they didn’t allow tethering. But I didn’t know that the international data plan I was paying $60 a month for jumped to $150 for no other reason than that I had an iPhone. This is one aspect of the iPhone I won’t miss, and it’s clear AT&T is milking every dollar it can from the early iPhone adopters.

Google Maps tracking was very jerky – Visiting a number of cities I don’t know well, I became a big fan of the Windows Live Search application and its Track GPS feature, which would follow you on a map and give you close to real-time navigation. The only quirk I found with this on my Treo Pro was that it was sometimes slow to acquire the GPS signal. The iPhone has the Google Maps application, which is OK, but in practice I found it very jerky when trying to track your progress – so much so that if you tried to use it for navigation you might not know you had passed a turn.

Apple’s figured out application interaction – Sometimes (when it was working) I would use the iPhone to clear stuff out of my e-mail inbox. With just a swipe of my finger I could quickly delete an e-mail, or bring one up and use two fingers to make the text larger. The multi-select and delete had to be one of my favorite features. I also really liked how you could easily resize e-mails and web pages; it made the mobile device feel really usable.

Connected Everywhere? Not yet! – Apple may have great insight, but the iPhone may be ahead of its time in assuming that users will be connected everywhere. I was shocked on my first airplane trip when I started to delete a few e-mails, only to get a cryptic error message saying it couldn’t move the message. It turns out that “offline” isn’t in Apple’s vocabulary. I believe we will get there eventually, but today occasionally-connected is still a necessity.

Physical Keyboard vs. More Screen Real Estate – This is a tough one for me: I really liked the iPhone’s large screen area. In fact, now that I’m back on my Treo Pro, I keep thinking how small the screen is because of the keyboard. That said, there’s nothing like the feel of real keys when typing a message, no matter how short the message is. The iPhone’s soft keyboard grew on me and I got better at it, but it still felt really clumsy compared to a real keyboard. Also, the correction algorithms that suggest spelling and word completion are very poor on the iPhone compared to what I’m used to on my Treo. At the end of the day, though, I would trade a physical keyboard for more screen space.

What’s with only supporting one Exchange account? – This isn’t just an Apple thing; Windows Mobile does it too. It’s great that devices are starting to support multiple e-mail accounts, but there needs to be support for multiple Exchange accounts as well. It’s clear that Exchange e-mail support was an afterthought for the iPhone. Beyond just the multiple-account support, I found the error messages cryptic, and 3 or 4 times I had to completely reset my mail account to get things working again. This seemed a little better in the recent 3.1 upgrade, but not much.

It was small but made a great reader – With the larger screen I was able to use the iPhone for reading e-mail, RSS feeds, Facebook, newspapers, and books. In fact, I really liked the Amazon Kindle reader. While some prefer the non-backlit screen of the Kindle, for me the iPhone or iPod Touch is just the right size for a carry-anywhere reading device.

So between the coverage issues, the e-mail issues, and the outrageous international data pricing, I decided I could live without it for now. Don’t get me wrong – I really liked the phone, and it seemed to make me more productive. But having to keep resetting my e-mail, which sometimes caused the phone to not take inbound calls, got on my nerves. So for now I’m a free agent, on the roam for a new phone. Maybe one of the new Windows Mobile phones releasing in October will be interesting, or maybe I will go through withdrawal and live with the iPhone quirks – stay tuned!

Thursday, Sep 10, 2009

RIA Services – Finding the InnerException

One of the challenges I’ve run into using RIA Services is that sometimes you get back a message saying to check the InnerException, only to find it is null.

 

In the following example, I had deleted some data and submitted changes.  I set a breakpoint on the server side in the delete method, so I knew it was getting called and ran OK.  Back on the client side I would see an exception when checking the results of the submit operation, as you can see in the following example.  The catch: notice that the message tells me to check the InnerException – which is null!

[Image: submit operation error message directing you to a null InnerException]

 

While I clearly believe this is a bug and have reported it, I wanted to share how I track down the real error.  The quickest path I have found is to override the PersistChangeSet method on the server-side DomainService.  As you can see in the following example, all I do is capture the error, and I can set a breakpoint there.  You can then quickly determine what the error is.  If you want, this is also where you could “fix up” the exception to pass back a more meaningful message.

 

Update: From Nikhil – he suggests overriding Submit, which is a good suggestion. The idea is that you need to catch the error server side in order to know the “why”, so there are two places you can hook into if you want.

 

[Image: PersistChangeSet override with the exception captured]
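For reference, the override described above looks roughly like the following sketch. The try/catch shape is an assumption based on the description, and the PersistChangeSet signature changed between RIA Services releases, so check the version you are on:

```csharp
// Server-side DomainService: capture the real exception before it is
// reported to the client with a null InnerException.
protected override bool PersistChangeSet(ChangeSet changeSet)
{
    try
    {
        return base.PersistChangeSet(changeSet);
    }
    catch (Exception ex)
    {
        // Set a breakpoint here: ex holds the real cause. This is also
        // where you could wrap it in a more meaningful exception.
        System.Diagnostics.Debug.WriteLine(ex);
        throw;
    }
}
```

Per Nikhil’s suggestion in the update above, overriding Submit would give you a second, broader hook for the same purpose.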

Monday, Aug 24, 2009

Kicking the tires on SQL Azure

Last week I got my token to activate the SQL Azure service and took a few minutes to take it for a test drive.  If you haven’t heard about SQL Azure, it’s the re-launch of SQL Data Services (SDS).  SDS used to be a much more restrictive model that really didn’t offer any hope of sharing code between a traditional SQL Server and the cloud.  At Mix09 it was announced that SDS was going to go through some major changes for the better and move toward fuller relational support and a powerful subset of TSQL.  In July, SDS was renamed SQL Azure to better align the name.

Like other Azure services, once I got the token I just went to the portal, where there’s now a SQL Azure section.  After registering my token it allowed me to create a SQL Azure instance, and I then proceeded to create a new database via the web interface – all of which was fast and painless. With a new database in hand, I set out to figure out how to connect using SQL Studio. From the portal you can get your connection information, but there are a few quirks with the CTP you have to deal with.  Zach’s blog post on using SQL Azure is the best place to start figuring out how to connect.  Like I said, it’s a little quirky, and you have to be patient with the timeouts that kick you off if you leave it idle, but the fact that I can use SQL Studio to query data in the cloud was a huge plus!

Next, I wanted to move some table definitions from my local SQL Server to my SQL Azure database.  There’s no copy/upload database option yet – let’s hope that gets added – but in the meantime I found a couple of things that worked.  First, I tried the create script from within SQL Studio.  I copied it into my query window for the new database and gave it a try.  You have to tweak the SQL some, because things like USE database aren’t supported in the CTP.  There are some other unsupported keywords as well; I just kept deleting them until my table create ran!

After some experimenting I found that using Red Gate’s compare tool was the best approach, because it would script multiple tables at once and the SQL seemed cleaner for what SQL Azure wanted (cleaner, not perfect – I still had to do some tweaking).  The way I tried this was to create another empty database locally to compare against in order to create my script.  That also worked: as I modified the tables, I would just compare against that local copy and run the (slightly tweaked) change script on both the local copy and my SQL Azure database.  Who knows – maybe Red Gate will come out with a version specifically for doing this, as I bet they could get something out quicker than Microsoft can tweak SQL Studio.

The database I moved to SQL Azure had a few tables that supported an application using Entity Framework.  This is the part where I emphasize the improvements between the prior SDS and SQL Azure.  I simply took my connection string, pointed it to the cloud, and I was off and running.  Now, I’m not saying you will find SQL in the cloud a seamless transition for every application, and clearly there are some inherent differences with SQL Azure that you need to learn.  But compared to the prior SDS, I’m impressed, and I like where we are heading.  After almost a week of running this small database in the cloud with an application I use daily, so far so good!
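The connection string change was essentially just repointing the server. As a sketch only – the server name, database, and credentials below are placeholders, not the ones from my application – a SQL Azure connection string looks something like this (note the user@server form for the user ID):

```xml
<!-- Sketch only: all names and credentials are placeholders. -->
<connectionStrings>
  <add name="MyDb"
       connectionString="Server=tcp:myserver.database.windows.net;Database=MyDb;User ID=myuser@myserver;Password=...;Encrypt=True;"
       providerName="System.Data.SqlClient" />
</connectionStrings>
```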

Speaking of learning, there’s a lot more info on the SQL Azure dev center on MSDN; you can find it here.
