Configuring RDP access to Windows Azure VMs


With the launch of Windows Azure SDK 1.3, Microsoft has given us the ability to connect to our Windows Azure VMs using Remote Desktop.

Using the Windows Azure Tools for Visual Studio 2010, it is straightforward to configure your application to allow remote desktop access to the VMs:

  • Right-click the service in VS2010 and choose “Publish”
  • Click the link “Configure Remote Desktop Connections…”
  • Check “Enable connections for all roles” and choose or create a certificate to encrypt the password
  • Enter username, password and the user’s expiry date and click “OK”.
  • Perform the rest of the deployment as usual (make sure to upload the certificate, if you chose to create a new one)

If all you want is remote desktop access to your Azure instances this will do. If you want to understand what is going on and how to manually configure your application for remote desktop access then read on.

Manually configuring the application for remote desktop access is a two-part process:

  • The application must be configured to allow RDP connections on port 3389
  • You need to create an encrypted password and enable the Fabric Controller to access the certificate used for the encryption.

Configuring the application

In order for the Windows Azure load balancer to allow inbound RDP connections, you have to enable this in your service definition. Open the ServiceDefinition.csdef file and locate the WebRole/WorkerRole sections describing the roles for which you want to allow remote desktop access. Import the module “RemoteAccess” in each section:

<Import moduleName="RemoteAccess" />


Because of this import the Fabric Controller will set up internal TCP endpoints on port 3389 for each role.

In addition to these internal endpoints, exactly one role needs to define an input endpoint to which the load balancer can route inbound RDP connections. When you access an instance via RDP, this role will forward the connection to the instance to which you intend to connect.

Thus, you have to add the following to exactly one of the sections in ServiceDefinition.csdef:

<Import moduleName="RemoteForwarder"/>


Assuming that we started out with a service containing one web role and one worker role, our service definition now looks like this:

<?xml version="1.0" encoding="utf-8"?>
<ServiceDefinition name="RDService" xmlns="http://schemas.microsoft.com/ServiceHosting/2008/10/ServiceDefinition">
  <WebRole name="WebRole1">
    <Sites>
      <Site name="Web">
        <Bindings>
          <Binding name="Endpoint1" endpointName="Endpoint1" />
        </Bindings>
      </Site>
    </Sites>
    <Endpoints>
      <InputEndpoint name="Endpoint1" protocol="http" port="80" />
    </Endpoints>
    <Imports>
      <Import moduleName="RemoteAccess" />
      <Import moduleName="Diagnostics" />
    </Imports>
  </WebRole>
  <WorkerRole name="WorkerRole1">
    <Imports>
      <Import moduleName="RemoteAccess" />
      <Import moduleName="RemoteForwarder" />
      <Import moduleName="Diagnostics" />
    </Imports>
  </WorkerRole>
</ServiceDefinition>

If you have used VS2010 to edit ServiceDefinition.csdef you will see that a number of settings have been added to each role in the service configuration file, ServiceConfiguration.cscfg (if you haven’t used VS2010 you will have to add them yourself):

<Setting name="Microsoft.WindowsAzure.Plugins.RemoteAccess.Enabled" value="" />
<Setting name="Microsoft.WindowsAzure.Plugins.RemoteAccess.AccountUsername" value="" />
<Setting name="Microsoft.WindowsAzure.Plugins.RemoteAccess.AccountEncryptedPassword" value="" />
<Setting name="Microsoft.WindowsAzure.Plugins.RemoteAccess.AccountExpiration" value="" />

Moreover, for the role acting as Remote Forwarder, VS2010 has added:

<Setting name="Microsoft.WindowsAzure.Plugins.RemoteForwarder.Enabled" value="" />


and, finally, a new element has been added to the Certificates section:

<Certificates>
  <Certificate name="Microsoft.WindowsAzure.Plugins.RemoteAccess.PasswordEncryption"
               thumbprint="" thumbprintAlgorithm="sha1" />
</Certificates>

The first four settings are obviously there to allow the Fabric Controller to configure each VM when the application is deployed. At this point, we can fill in all of these fields except AccountEncryptedPassword. In my case, I enter “true”, “rune” and “2010-11-06 00:00:00Z” (see ‘Setting Up A Remote Desktop Connection For A Role’ for information on the date/time format). Moreover, for WorkerRole1 I put “true” in the “Microsoft.WindowsAzure.Plugins.RemoteForwarder.Enabled” field.
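
With those values filled in, the RemoteAccess settings for a role look like this (the encrypted password is still missing at this point):

<Setting name="Microsoft.WindowsAzure.Plugins.RemoteAccess.Enabled" value="true" />
<Setting name="Microsoft.WindowsAzure.Plugins.RemoteAccess.AccountUsername" value="rune" />
<Setting name="Microsoft.WindowsAzure.Plugins.RemoteAccess.AccountEncryptedPassword" value="" />
<Setting name="Microsoft.WindowsAzure.Plugins.RemoteAccess.AccountExpiration" value="2010-11-06 00:00:00Z" />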

Specifying the password

We need to choose a password for our remote desktop user, and for security reasons it needs to be encrypted before we enter it into ServiceConfiguration.cscfg. Thus, we need to perform the following tasks:

  • Create an X509 certificate
  • Encrypt a password using the certificate
  • Put the encrypted password in ServiceConfiguration.cscfg
  • Provide the certificate to the Fabric Controller
  • Specify the certificate’s unique name (thumbprint) in ServiceConfiguration.cscfg

To create an X509 certificate, run a Visual Studio Command Prompt as administrator and execute the following command:

makecert -sky exchange -r -n "CN=AzureRD" -pe -a sha1 -len 2048 -ss My "AzureRD.cer"

(you can run makecert -! to see a description of the flags used and read general guidelines for creating a certificate here).

This will create a certificate and store it in the file AzureRD.cer as well as install it into the local certificate store. You can verify this using PowerShell:

PS C:\Users\Rune Ibsen> cd cert:\CurrentUser\My
PS cert:\CurrentUser\My> dir

    Directory: Microsoft.PowerShell.Security\Certificate::CurrentUser\My

Thumbprint                                 Subject
----------                                 -------
FF89E4AEFF26E891CAD29F9C59F6E5F9050B2337   Windows Azure Tools
55825F4612F9EDCAB073DE98AA5419FA710A2973   AzureRD

Notice that this listing contains the certificates’ thumbprints. You will need your certificate’s thumbprint shortly.

Next, we need to actually use the certificate to encrypt the password. Fire up PowerShell and run the following commands:

[Reflection.Assembly]::LoadWithPartialName("System.Security")
$pass = [Text.Encoding]::UTF8.GetBytes("yourpassword")
$content = new-object Security.Cryptography.Pkcs.ContentInfo -argumentList (,$pass)
$env = new-object Security.Cryptography.Pkcs.EnvelopedCms $content
$env.Encrypt((new-object System.Security.Cryptography.Pkcs.CmsRecipient(gi cert:\CurrentUser\My\55825F4612F9EDCAB073DE98AA5419FA710A2973)))
[Convert]::ToBase64String($env.Encode())
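
If you would rather do the encryption in C# than PowerShell, here is a minimal sketch using the same EnvelopedCms API (it requires a reference to the System.Security assembly; the thumbprint is the one for my AzureRD certificate above):

using System;
using System.Security.Cryptography.Pkcs;
using System.Security.Cryptography.X509Certificates;
using System.Text;

class EncryptRdpPassword
{
    static void Main()
    {
        // Locate the encryption certificate in the CurrentUser\My store.
        var store = new X509Store(StoreName.My, StoreLocation.CurrentUser);
        store.Open(OpenFlags.ReadOnly);
        var cert = store.Certificates.Find(X509FindType.FindByThumbprint,
            "55825F4612F9EDCAB073DE98AA5419FA710A2973", false)[0];

        // Envelope the password for the certificate's key pair (PKCS #7).
        var envelope = new EnvelopedCms(new ContentInfo(Encoding.UTF8.GetBytes("yourpassword")));
        envelope.Encrypt(new CmsRecipient(cert));

        // This Base64 string goes into the AccountEncryptedPassword setting.
        Console.WriteLine(Convert.ToBase64String(envelope.Encode()));
    }
}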

The Base64 string resulting from the last command is what you need to put in the AccountEncryptedPassword fields in ServiceConfiguration.cscfg. Moreover, put the certificate’s thumbprint as the value of the thumbprint attribute in ServiceConfiguration.cscfg.

In order for the Fabric Controller to configure the virtual machines with the chosen password, the Fabric Controller needs to have access to the certificate that was used to encrypt the password. In particular, you will have to provide a personal information exchange (PFX) certificate.

To create a PFX certificate go to a command prompt on your local machine and run

certmgr.msc

This will bring up the certificate manager. Locate the newly created certificate, right-click it and choose “All Tasks –> Export…”. Follow the wizard, indicating along the way that you want to export the private key. Once complete, this process will create a .pfx file.

Now go to the Windows Azure portal and create a new hosted service. Then select the Certificates folder and click Add Certificate in the upper left corner. Upload the .pfx file you just created.

Next, deploy the application to the hosted service just created and, once deployment is complete, connect to a VM using RDP. Presto! You see the familiar Windows 2008 desktop!

I think it is worth noting that you can enable and disable remote desktop access on a per-role basis at runtime, that is, without having to redeploy.

Also notice that if you enumerate the instance endpoints for your roles, you will see that they now have an instance endpoint named “Microsoft.WindowsAzure.Plugins.RemoteAccess.Rdp”. I did this by connecting to an instance via RDP and enumerating the endpoints using PowerShell:

[Screenshot: instance endpoints listed in PowerShell]
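
You can also enumerate the endpoints from code running inside a role; a minimal C# sketch using the service runtime API:

using System;
using Microsoft.WindowsAzure.ServiceRuntime;

public static class EndpointLister
{
    // Must be called from code running inside a role instance.
    public static void ListInstanceEndpoints()
    {
        foreach (var role in RoleEnvironment.Roles.Values)
            foreach (var instance in role.Instances)
                foreach (var endpoint in instance.InstanceEndpoints)
                    // The RDP endpoint shows up as
                    // Microsoft.WindowsAzure.Plugins.RemoteAccess.Rdp on port 3389.
                    Console.WriteLine("{0} {1} {2}: {3}", role.Name, instance.Id,
                        endpoint.Key, endpoint.Value.IPEndpoint);
    }
}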

If you inspect the endpoints more closely, you will see that they are listening on port 3389.



Windows Azure Acceleration Workshop


I will be teaching a 3-day training session on the Windows Azure Platform starting on the 30th of November.

The content will be a bespoke mix of Windows Azure subjects and alternate between presentations and hands-on labs, providing the participants ample opportunity to get their hands dirty in Microsoft’s training center.

This particular event is by invitation only, but if you are interested in a similar training course or just an introductory session on Cloud Computing in general and Windows Azure in particular, please contact my company, Copenhagen Software.


Architecting for on-premise as well as Azure hosting


If you’re an ISV and you’re contemplating whether you should migrate your product to Windows Azure, you may be asking yourself if you can have your cake and eat it too, that is, have the exact same code base deployed with customers running on-premise as well as with customers running in the cloud. In this blog post I will offer some technical guidance on this issue.

These are my assumptions:

  1. You want to offer your software as Software as a Service, that is, as a hosted service running on Windows Azure
  2. Each customer will want to have a dedicated virtual machine for running the application (that is, I am not going to go into multi-tenancy here)

I am not making any assumptions on whether your application can currently be scaled horizontally. If it can’t, moving it to Azure will not change this. The Azure load balancers use a round-robin algorithm and have no support for sticky sessions out of the box, so making the application scale will most likely be a non-trivial exercise.

Isolating dependencies

The key to having an application run both on-premise and in Windows Azure is to isolate the application’s dependencies on the hosting environment, hide these dependencies behind suitable abstractions and then provide implementations of these abstractions tailored to each hosting environment.

So, the first step in taking your application to the cloud should be to identify the dependencies for which we will need to provide abstractions. Obviously, we only need to consider the parts of the application’s environment which differ between Windows Azure and an on-premise environment. Typical examples include:

  • Database (SQL Server vs. SQL Azure)
  • Shared file systems

Thus, you need to re-architect your application from something that looks like this:

[Diagram: raw architecture]

to something that looks more like this:

[Diagram: on-premise architecture]

If you are in luck, you already have an abstraction which somewhat isolates the rest of your application from the particular database implementation. This would be the case if you are using an ORM like Entity Framework, NHibernate or something similar. Isolating other dependencies may require more work.
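
As a sketch of what such an abstraction might look like, here is a file store hidden behind an interface, together with an on-premise implementation (the names are mine and purely illustrative):

using System.IO;

// The rest of the application only ever talks to this abstraction.
public interface IFileStore
{
    void Save(string name, byte[] content);
    byte[] Load(string name);
}

// On-premise implementation backed by a (possibly shared) file system.
public class LocalFileStore : IFileStore
{
    private readonly string root;

    public LocalFileStore(string root)
    {
        this.root = root;
    }

    public void Save(string name, byte[] content)
    {
        File.WriteAllBytes(Path.Combine(root, name), content);
    }

    public byte[] Load(string name)
    {
        return File.ReadAllBytes(Path.Combine(root, name));
    }
}

The corresponding cloud implementation would be backed by Windows Azure blob storage.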

Taking it to the cloud

At this point, you have an application with a nice encapsulation of external dependencies. Even though the application is only able to run on-premise at this point, the architecture has already been improved.  The next step is obviously to provide implementations of your abstractions suitable for running in the cloud. This is the fun part: If you have identified the right abstractions, you can go nuts in cloud technology, using massively scalable storage, asynchronous communication mechanisms, CDNs etc. This process should give you an application capable of running on Windows Azure:

[Diagram: cloud architecture]

Configuring dependencies

The code base now contains everything we need for an on-premise installation as well as for a cloud deployment. However, to truly decouple the application from its dependencies, the set of dependency encapsulations to use in a given installation should be easily configurable. To this end, use your favorite dependency injection tool. Unless you decide to roll your own tool, the tool you choose will most likely support multiple methods of configuration and you can choose whatever method you prefer.

If you want to get really fancy, you may even choose to have the application configure its dependencies on its own at startup. The application can use the Windows Azure API to tell whether it is running on Windows Azure: the information is available through the Microsoft.WindowsAzure.ServiceRuntime.RoleEnvironment class, which has a static property called IsAvailable.

I usually hide the environment information provider behind a static gateway pattern.
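
A minimal sketch of such a gateway (the type and member names are mine, not an official API):

using Microsoft.WindowsAzure.ServiceRuntime;

// Static gateway hiding how the hosting environment is detected.
public static class HostingEnvironment
{
    // True when running under the Windows Azure fabric
    // (in the cloud or in the Development Fabric).
    public static bool IsRunningOnAzure
    {
        get { return RoleEnvironment.IsAvailable; }
    }
}

At startup, the application can then consult the gateway and register, say, the blob storage backed file store instead of the local one.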

This was a quick rundown of one path to the cloud. Obviously, there is much more to be said, especially with respect to the ‘taking it to the cloud’ step. I’ll save that for another day.


Packaging a Web Site for Windows Azure


I am spending a few days participating in the “Bringing Composite C1 to Windows Azure” workshop this week.

I didn’t know Composite C1 beforehand, but so far it has been a pleasure to get to know it.

The only major hiccup we encountered on the first day was that the Composite C1 application is not a Web Application Project in the Visual Studio sense, but is instead a Web Site.

Windows Azure web roles are by default Web Application Projects, so we set out to convert Composite C1 to a Web Application Project while discussing whether it is possible to deploy a Web Site to Windows Azure.

Since Windows Azure is just Windows 2008 with IIS 7.0, I figured it should be possible to run a Web Site on Windows Azure, but whether we could get the management services to deploy the Web Site in the first place was another matter.

Coincidentally, Steve Marx recently wrote a blog post on manually packaging up an application for deployment to Windows Azure, so in this blog post I will attempt to deploy a Web Site to Windows Azure using a manual packaging approach. I will be using a generic web site instead of Composite C1 since using Composite C1 would probably cause some unrelated problems to surface.

So, I start out by creating a new Web Site:

[Screenshot: the new Web Site in Visual Studio]

Next, I need to create a service definition which will tell Windows Azure what my web site looks like. For now, I will define a service with a single web role. So, I create a new file called ServiceDefinition.csdef:

[Screenshot: creating ServiceDefinition.csdef]

and I fill in some basic parameters:

[Screenshot: the completed ServiceDefinition.csdef]
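
The definition is minimal; something along these lines, matching the role name MyWebRole used in the packaging step below (the service name and endpoint name are arbitrary, and I am assuming the pre-1.3 schema that matches the SDK 1.2 tooling used here):

<?xml version="1.0" encoding="utf-8"?>
<ServiceDefinition name="MyWebSite" xmlns="http://schemas.microsoft.com/ServiceHosting/2008/10/ServiceDefinition">
  <WebRole name="MyWebRole">
    <InputEndpoints>
      <InputEndpoint name="HttpIn" protocol="http" port="80" />
    </InputEndpoints>
  </WebRole>
</ServiceDefinition>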

Now, I could go ahead and package up the application and deploy it. However, if I do this, the web role will be extremely sick and throw exceptions saying “Unrecognized attribute ‘targetFramework’”, referring to the targetFramework=”4.0” attribute in web.config.

Figuring out why this happens and what to do about it will require some further investigation. For now I just go ahead and delete the attribute. This also means that I need to delete the “using System.Linq;” statements in Site.Master, Default.aspx.cs and About.aspx.cs.

To package the application I need to use the cspack.exe that comes with the Windows Azure SDK. I’ve added the SDK’s bin directory to my PATH, so I can go ahead and package the application:

[Screenshot: packaging the application with cspack]

This is a pretty long command, so I’ll repeat it here for convenience:

cspack ServiceDefinition.csdef /role:MyWebRole;WebSite1 /copyOnly /out:MyWebSite.csx /generateConfigurationFile:ServiceConfiguration.cscfg

Windows(R) Azure(TM) Packaging Tool version 1.2.0.0 for Microsoft(R) .NET Framework 3.5
Copyright (c) Microsoft Corporation. All rights reserved.

c:\AzureWebSiteTest>


The cspack application certainly doesn’t seem very verbose. Anyway, I can now go ahead and deploy the web site to the Azure Development Fabric using another tool from the SDK, csrun:

[Screenshot: deploying to the Development Fabric with csrun]
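
csrun takes the package (here, the /copyOnly output directory) and the configuration file; assuming the file names from the packaging step, the command was along the lines of:

csrun MyWebSite.csx ServiceConfiguration.cscfg /launchBrowser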

And presto! I now have a web site running on the Development Fabric:

[Screenshot: the web site running on the Development Fabric]

To actually deploy the web site to the cloud you need to create a proper deployment package. To do this, leave out the /copyOnly flag from the packaging command:

[Screenshot: packaging for the cloud]

Again, I’ll repeat the command:

cspack ServiceDefinition.csdef /role:MyWebRole;WebSite1 /generateConfigurationFile:ServiceConfiguration.cscfg

This will generate a file called ServiceDefinition.cspkg that you can upload through the Windows Azure Portal along with ServiceConfiguration.cscfg.

Once the Fabric Controller has done its thing we have a web site in the cloud:

[Screenshot: the web site running in the cloud]

I had to cut some corners in the process, but at least this shows that the web site model _can_ run on Windows Azure.


Resetting the Development Fabric deployment counter


When you are working with Windows Azure on your local machine, each time you deploy your application to the Development Fabric that particular deployment will receive a unique name along the lines of

deployment(<deploymentnumber>)

Here <deploymentnumber> is a number that is incremented each time you deploy an application to Development Fabric. This may start to look ridiculous after a while, so you may want to reset the counter.

The Development Fabric uses a temp folder,

C:\Users\<your name>\AppData\Local\dftmp,

for storing deployed applications.

To reset the deployment counter go to

C:\Users\<your name>\AppData\Local\dftmp\s0

and delete all the previous deployments. Next, use Notepad to open the file

C:\Users\<your name>\AppData\Local\dftmp\_nextid.cnt

and reset the sole value in that file to 0.

Voila!


TechTalk on Windows Azure on the 22nd of September


I will be giving a talk on Microsoft Windows Azure on the 22nd of September at Microsoft’s Danish headquarters in Hellerup.

Attendance is free of charge, so sign up and get ready for a tour through the Cloud!


Is the Web Application Toolkit for Freemium Applications from Microsoft any good?


Microsoft recently released the “Web Application Toolkit for Freemium Applications” for building applications based on the freemium model. I am not a big fan of the Freemium model myself, but I thought it would be interesting to take a closer look at the toolkit anyway.

First off, let’s recap the freemium model. According to Wikipedia:

Freemium is a business model that works by offering basic Web services, or a basic downloadable digital product, for free, while charging a premium for advanced or special features.

Thus, to build applications for the freemium model, we need to be able to

  1. Have multiple feature sets and manage these
  2. Have users associated with one or more feature sets
  3. Have a user’s chosen feature sets reflect in the application GUI presented to the user
  4. Have a user’s chosen feature sets automatically reflect in the billing

It might also be nice to

  1. have users grouped into categories with catchy names and associate feature sets with each category
  2. be able to easily move users between categories, possibly by self-service

This is what the toolkit should ideally help us achieve.

The tools

In broad terms, the toolkit is built around two concepts, features and SKUs, and it uses the ASP.NET Membership functionality to associate users with SKUs. A SKU is a set of features, and each SKU has a unique identifier called a slug. SKUs are mapped to users by means of standard ASP.NET roles having the same name as the SKU slug. Thus, there might be a “Gold” SKU identified by the slug “gold”, and all users in the role “gold” are associated with the Gold SKU.
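
To illustrate the mapping (this is the general idea, not the toolkit’s actual API), checking a user’s SKU boils down to a standard ASP.NET role check:

using System.Web.Security;

public static class SkuHelper
{
    // A SKU is identified by its slug, and membership of the SKU
    // is simply membership of the ASP.NET role with the same name.
    public static bool UserHasSku(string userName, string slug)
    {
        return Roles.IsUserInRole(userName, slug);
    }
}

// e.g. SkuHelper.UserHasSku("alice", "gold")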

If you buy into this way of grouping features and associating them with users, the framework provides:

  • Page extension methods for displaying/hiding content based on the user’s SKU
  • Action attributes to allow or prevent the user from executing certain actions based on the SKU
  • An MVC 2 Area with controllers and views for managing SKUs

The verdict

The components described in the three bullets above are definitely valuable in any application based on the freemium model. However, their actual implementation leaves something to be desired. The provided functionality isn’t componentized properly and there is a general lack of extension points: to really use the toolkit, you need to buy into a lot of arbitrary design decisions which are unlikely to suit your application. Moreover, there is just a gust of code smell around it: views aren’t properly decoupled from business logic, no DI is used etc.

All in all the Web Application Toolkit for Freemium Applications is a nice initiative from Microsoft and it contains some relevant thoughts. If Microsoft decides to put some effort into the toolkit to provide extension points etc., it may even be able to kickstart your next freemium application. At the moment, however, the toolkit is pretty blunt. It is nothing more than a sample implementation which may provide inspiration but does not deserve to be called a toolkit.

The Web Application Toolkit for Freemium Applications is available on MSDN.


How to trip up a prospective SaaS customer


I was recently searching for a SaaS offering for email management and stumbled upon a product that looked interesting.

The provider’s website front page is pretty decent. It presents three main gateways if you want to learn more about the product and its benefits:

  • A video tour
  • A set of video interviews
  • A bunch of third-party reports

I really like the idea of using online videos to present a SaaS product. If you ask me to spend 15 minutes concentrating and straining my eyes to read your product presentation, I will consider it. If you ask me to invest 15 minutes in watching a video, I will be happy to kick back and enjoy the show. A video presentation available online 24/7 is a great approach for many SaaS companies.

Thus, I clicked the “Take the video tour here” button and chose the first video from a list. To access the videos, you then need to sign up in the guestbook:

[Screenshot: the guestbook sign-up form]

“So, just to get to see a set of videos introducing your service, I have to fill in all those fields? And you’ve even added aggressive, red asterisks to the majority of the fields, indicating that they are required and depriving me of any hope that you might let me off easy? Hmmm, I wonder if someone has tweeted anything interesting?”.

The point is: the service provider doesn’t need this information just to show me some videos, so they shouldn’t ask for it! And certainly not require it!

As I mentioned in my previous post, every step the prospect has to take to become a paying customer is a potential barrier to entry. In this case, the provider has lost me before I even know whether their product is fantastic. Once I have checked Twitter, put on another kettle for coffee and gotten back to the computer, I will fire up a new browser, hit Google and restart the search.

Only require information from the user when it is necessary. If you need information like company name etc. to set up a test account, provide defaults. If some piece of information is only required when using a particular feature of your application, only ask for the information when that feature is first used.


Is the SaaS sales cycle just another webshop checkout process?


When delivering a SaaS offering, two factors are crucial to profitability:

  • Customer Volume
  • Customer Acquisition Cost (CAC)

You obviously want a large customer volume and a low CAC. To this end you will typically employ the power of viral marketing and cheap and scalable online marketing. Hence, you will not have sales people in the field actively herding customers through the sales funnel. Instead, your online initiatives and community need to be able to turn leads into customers autonomously. Actually, the sales cycle is not a sales funnel as much as it is a sales vacuum hose. Customers have to be sucked into your business. Since no-one is around to actively respond to prospects’ whims and wishes and since prospects have a free will (gasp!), the slightest barrier to entry in your sign-up process can have a large impact on your customer volume.

If you read up on some of the lore on designing a webshop and its checkout process, you will see that a lot of energy is put into streamlining the buying process. At every step of the process, a webshop risks losing customers if they can’t figure out where to go next or if they are simply distracted.

When selling SaaS, streamlining the process from initial interest until you can charge the customer is even more important and even harder to get right. SaaS vendors usually offer the option of trying the product for a period of time before buying. This means that there is ample room for something to trip the customer up.

The process of getting someone to sign up for your service looks something like this:

  1. Get them to visit your website
  2. Get them to read about your product and realize that it might address their pain
  3. Get them to try out your product
  4. Get them to sign up
  5. Get them to renew their subscription

Each step in the process is a possible barrier to entry, so you have to think very carefully about each step. Heed the advice of webshop designers, e-commerce consultants and usability gurus, but keep in mind that turning a prospect into a paying SaaS customer is a much more complicated process than making a customer go from A to B in a webshop.

When designing your sales cycle, you should keep the 5 steps above in mind and actively seek to make the transition between each step as effortless as possible.


Monitoring your StackOverflow status with Python


I recently read this post on meta.stackoverflow.com on monitoring your SO status using Python. Since I’ve been looking into Python (IronPython in particular) lately, I figured it might be fun to try out the provided script in IronPython.

I quickly experienced some problems pertaining to the sqlite3 and urllib2 modules which I was unable to solve, so I downloaded and installed Python 3.1 for Windows.

The original script does not run against Python 3.1 because of some string encoding issues. Moreover, the regular expressions used in the original script no longer match the markup of SO, so I have updated the script somewhat:

from sqlite3 import dbapi2 as sqlite
import re, os, sys, time
import urllib.request as urllib2

questLen = 60  # characters before the ellipsis kicks in
connection = sqlite.connect("C:\\Users\\Rune Ibsen\\Projects\\SO\\profile.db")
cursor = connection.cursor()

user =  # your user id

request = urllib2.Request(url='http://stackoverflow.com/users/%i/myProfile.html' % (user))

profile = urllib2.urlopen(request).read()
profile = profile.decode("utf-8")
rep = re.compile('summarycount">.*?([,\d]+)</div>.*?Reputation', re.S).search(profile).group(1)
rep = rep.replace(',', '')
badge = re.compile('<div class="summarycount ar".{0,50}>(\d+).{1,100}Badges', re.S).search(profile).group(1)

stQuestion = re.compile('Questions</h.*?Answers</h', re.S).search(profile).group()
mQuestion = re.compile('question-summary narrow.*?id="question-summary-(\d+)".*?class="votes".*?(\d+).*?class="status.+?(\d+).*?<h3><a.+?>(.+?)</a>', re.S).findall(stQuestion)
# mQuestion contains tuples of (id, votes, answers, title)

stAnswer = re.compile('<h1>Answers</h1>.*?<script', re.S).search(profile).group()
mAnswer = re.compile('answer-summary"><a href="/questions/(\d*).*?votes.*?>(-?\d+).*?href.*?>(.*?)<.a', re.S).findall(stAnswer)

stTime = time.strftime("%Y-%m-%d %H:%M:%S")

print(stTime)
print('\nQuestions (' + str(len(mQuestion)) + '):')
for quest in mQuestion:
    cursor.execute('SELECT count(id), votes FROM Questions WHERE id = ' + quest[0] + ' AND type=0;')
    item = cursor.fetchone()
    if item[0] > 0:
        lastQ = (int(quest[1]) - item[1])
        if lastQ == 0: lastQ = ""
        cursor.execute('UPDATE Questions SET votes = %s WHERE id = %s AND type = 0' % (quest[1], quest[0]))
    else:
        cursor.execute('INSERT INTO Questions VALUES("' + quest[3] + '",' + quest[1] + ',0,' + quest[0] + ');')
        lastQ = "(NEW)"
    if len(quest[3]) > questLen:
        elips = "..."  # in case the question title is really long
        nElips = 0
    else:
        elips = ""
        nElips = 3
    print('%s%s %s%s' % (quest[3][:questLen].ljust(questLen + nElips, " "), elips, ("(" + str(quest[1]) + ")").ljust(5, " "), lastQ))

print("\nAnswers (" + str(len(mAnswer)) + '):')
for answer in mAnswer:
    aId = answer[0]
    aVotes = answer[1]
    aQuestion = answer[2]
    cursor.execute('SELECT count(id), votes FROM Questions WHERE id = ' + aId + ' AND type=1;')
    item = cursor.fetchone()
    if item[0] > 0:
        lastQ = int(aVotes) - item[1]
        if lastQ == 0: lastQ = ""
        cursor.execute('UPDATE Questions SET votes = %s WHERE id = %s AND type = 1' % (aVotes, aId))
    else:
        cursor.execute('INSERT INTO Questions VALUES("' + aQuestion + '",' + aVotes + ',1,' + aId + ');')
        lastQ = "(NEW)"
    if len(aQuestion) > questLen:
        elips = "..."
        nElips = 0
    else:
        elips = ""
        nElips = 3
    print('%s%s %s%s' % (aQuestion[:questLen].ljust(questLen + nElips, " "), elips, ("(" + str(aVotes) + ")").ljust(5, " "), lastQ))

cursor.execute('SELECT rep, badges, questions, answers, COUNT(date) FROM profile WHERE user = ' + str(user) + ' ORDER BY date DESC;')
oldData = cursor.fetchone()
if oldData[4] == 0:
    oldData = [0, 0, 0, 0]
cursor.execute("INSERT INTO profile VALUES(%s,%s,%s,%s,'%s',%i);" % (rep, badge, len(mQuestion), len(mAnswer), stTime, user))
print('\n')
print('%s Questions, %s new' % (len(mQuestion), (len(mQuestion) - oldData[2])))
print('%s Answers, %s new' % (len(mAnswer), (len(mAnswer) - oldData[3])))
print('%s Reputation (%+i)' % (rep, (int(rep) - oldData[0])))
print('%s Badges, %s new' % (badge, (int(badge) - oldData[1])))
connection.commit()

Note that

  • You will have to create an empty SQLite database or download one here
  • You will have to insert your own user id (which is easily identified from the URL if you go to your user page).

Running the script will result in something like this:

[Screenshot: sample output from the script]