First Windows Phone update released

February 24, 2011

Michael Stroh: Starting today, some of you might see something new on your Windows Phone: A message announcing that a software update is available. Woo hoo!

Now, before you get too excited, let me explain: This isn’t the update you’ve probably been reading about or perhaps waiting for, the one with copy and paste (but that’s coming soon).

No, this update is a relatively small one…

 

Use of Identity Property to Resolve Concurrency Issues

February 24, 2011


–  By Ajit Ananthram

 

Recently, I came across an interesting concurrency problem. A database I work with started experiencing intermittent locking and blocking issues, which resulted in timeouts in an application that used the database as its backend.

 

To analyse this, I wrote a script and scheduled it as a job that would execute during a given timeslot the next day. The script would look for blocking in the database and, when a block was found, log the relevant information in a table. This included the SQL routine causing the blocking, the SQL routine being blocked, the statement within the blocked routine, and the database object that was the source of contention.

The Trace Script (Main SQL)

SELECT
   DB_NAME() AS database_name,
   GETDATE() AS audit_time,
   s.spid AS process_id,
   s.blocked AS blocking_process_id,
   s.hostname,
   s.loginame,
   s.program_name,
   blocking_s.hostname AS blocking_hostname,
   blocking_s.loginame AS blocking_loginame,
   blocking_s.program_name AS blocking_program_name,
   REPLACE(REPLACE(buffer.[text], CHAR(10), ''),
           CHAR(9), '') AS sql_statement,
   SUBSTRING(buffer.[text], request.statement_start_offset/2,
   (CASE
      WHEN request.statement_end_offset = -1
      THEN LEN(CONVERT(NVARCHAR(MAX), buffer.[text])) * 2
      ELSE request.statement_end_offset
    END - request.statement_start_offset)/2) AS specific_sql,
   REPLACE(REPLACE(blocking_buffer.[text], CHAR(10), ''),
           CHAR(9), '') AS blocking_sql_statement,
   o.[name] AS blocking_object,
   blocking_tr_locks.request_mode
FROM
   sys.sysprocesses s
   INNER JOIN sys.dm_exec_connections conn
      ON s.spid = conn.session_id
   CROSS APPLY sys.dm_exec_sql_text(conn.most_recent_sql_handle) AS buffer
   LEFT JOIN sys.dm_exec_requests request
      ON conn.session_id = request.session_id
   INNER JOIN sys.dm_exec_connections blocking_conn
      ON s.blocked = blocking_conn.session_id
   CROSS APPLY sys.dm_exec_sql_text(blocking_conn.most_recent_sql_handle) AS blocking_buffer
   INNER JOIN sys.dm_tran_locks blocking_tr_locks
      ON s.blocked = blocking_tr_locks.request_session_id
   INNER JOIN sys.objects o
      ON blocking_tr_locks.resource_associated_entity_id = o.object_id
   INNER JOIN sys.sysprocesses blocking_s
      ON s.blocked = blocking_s.spid
WHERE
   s.blocked <> 0
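As a rough sketch of the logging step described above (not the author's exact job), each run of the blocking query can append its findings to an audit table. The table name dbo.tbl_block_audit and the abbreviated column list below are assumptions made for illustration; the real script captured the full set of columns shown above.

-- Hypothetical logging wrapper: persist blocking information on each run.
-- dbo.tbl_block_audit is an assumed name, and only a few columns are kept here.
IF OBJECT_ID(N'dbo.tbl_block_audit', N'U') IS NULL
    CREATE TABLE dbo.tbl_block_audit
    (
        audit_time          DATETIME      NOT NULL,
        process_id          SMALLINT      NOT NULL,
        blocking_process_id SMALLINT      NOT NULL,
        blocking_object     SYSNAME       NULL,
        blocked_sql         NVARCHAR(MAX) NULL
    )

INSERT INTO dbo.tbl_block_audit
    (audit_time, process_id, blocking_process_id, blocking_object, blocked_sql)
SELECT
    GETDATE(),
    s.spid,
    s.blocked,
    o.[name],
    buffer.[text]
FROM
    sys.sysprocesses s
    INNER JOIN sys.dm_exec_connections conn
       ON s.spid = conn.session_id
    CROSS APPLY sys.dm_exec_sql_text(conn.most_recent_sql_handle) AS buffer
    INNER JOIN sys.dm_tran_locks blocking_tr_locks
       ON s.blocked = blocking_tr_locks.request_session_id
    LEFT JOIN sys.objects o
       ON blocking_tr_locks.resource_associated_entity_id = o.object_id
WHERE
    s.blocked <> 0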

The next day, I looked at the data logged by the script. It was evident that most blocks were being caused by the execution of a stored procedure. This procedure made use of a transaction, which was getting blocked and causing the timeouts.

Since many SQL sessions were trying to execute the same stored procedure, one session ended up blocking one or more of the other sessions. The locks were always placed on a single table in the database, and the specific SQL statement being blocked within the procedure was always an update statement on this table.

On closely reviewing the data, I found that this table was created by an application developer to serve as a “Key Value Pair” (KVP) generator. The table contained 2 columns, one for a key and the other for the corresponding value. The key would be a static string and the value would be incremented by 1 each time. This incremented value would be used in several insert statements, which affected many tables.

It was obvious that, since one procedure call would update this value and hold the lock until the stored procedure completed with a commit or a rollback, the calls coming from other sessions would get blocked.

For you to visualise this, I have written some sample code and created a sample table, which are simplified versions of the database objects mentioned above.

Here is the “Key Value Pair” (KVP) table and some sample data

CREATE TABLE [dbo].[tbl_kvp](
 [column_key] [nvarchar](50) NOT NULL,
 [column_value] [int] NOT NULL
) ON [PRIMARY]
GO

INSERT tbl_kvp (column_key, column_value) VALUES (N'PR1', 40)
GO

The stored procedure which was used to increment the value for a given key

CREATE PROCEDURE [dbo].[USP_Get_Value_For_Key]
(
 @key NVARCHAR(50),
 @value INT OUTPUT
)
AS
BEGIN
 -- Initialise the output parameter
 SELECT @value = 0

 -- Increment the stored value for the given key; the lock taken by this
 -- update is held until the caller's transaction commits or rolls back
 UPDATE tbl_kvp SET column_value += 1 WHERE column_key = @key

 -- Return the incremented value
 SELECT @value = column_value FROM tbl_kvp WHERE column_key = @key
END
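
As a quick usage illustration (my own example rather than the author's, using the sample PR1 row inserted above, so the first call returns 41):

-- Fetch the next value for the PR1 key and display it
DECLARE @next_value INT
EXEC dbo.USP_Get_Value_For_Key @key = N'PR1', @value = @next_value OUTPUT
SELECT @next_value AS next_value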

The stored procedure causing the contention

CREATE PROCEDURE [dbo].[USP_Business_Process]
AS
BEGIN
 BEGIN TRANSACTION Business_Process
 BEGIN TRY
  DECLARE @val INT = 0

  -- Call key value incrementing stored procedure
  EXEC USP_Get_Value_For_Key 'PR1', @val OUTPUT
  
  SELECT @val -- Print key value for display

  -- There would be code present here in the actual stored procedure
  -- to insert the key value into multiple tables. To simulate this and
  -- also give enough time to review the locks, the transaction will
  -- be kept open for 10 seconds
  
  WAITFOR DELAY '00:00:10'

  COMMIT TRANSACTION Business_Process
 END TRY
 BEGIN CATCH
  ROLLBACK TRANSACTION Business_Process
 END CATCH
END

Trace Script Output

If you were to execute two concurrent calls to the business process stored procedure, the typical output you would see from the trace script would be as shown below.

[Screenshot: trace script output]

As you can see, the stored procedure that updates the KVP table (which gets called in the business process procedure) is blocked by another call to the business process procedure. The update statement for the KVP table is blocked, and the source of contention is the KVP table.
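
If you would like to reproduce the blocking yourself, one way to do it (a sketch, assuming three separate query windows in SQL Server Management Studio) is:

-- Window 1: start the business process; its transaction stays open for ~10 seconds
EXEC dbo.USP_Business_Process

-- Window 2: start a second call within that 10-second window; with the
-- KVP-based USP_Get_Value_For_Key this call waits on window 1
EXEC dbo.USP_Business_Process

-- Window 3: while both are running, execute the trace script shown earlier;
-- it reports the second session blocked by the first on the tbl_kvp update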

Now that I knew the source of the problem, I had to find ways of addressing it. The KVP table had been introduced with several keys in it, and there were many areas in the code which were calling the USP_Get_Value_For_Key stored procedure, thereby affecting not just the business process mentioned above, but several other processes.

I was keen on finding a solution that could be implemented for all keys in the KVP table without having to modify much of the existing code. This would mean that all processes suffering from this problem would benefit, and only a few changes would be required, which would make deployment of the fix easier.

Having worked with Oracle, I kept thinking that if I had to implement such 'incrementing' logic, I would probably have made use of sequences. Since SQL Server 2008 doesn't have sequences, the closest I could get to such an object was an identity column (the upcoming version of SQL Server, code-named 'Denali', does have sequences).
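
For comparison only (this was not an option here, since the database ran on SQL Server 2008), the same requirement could be met on 'Denali' (released as SQL Server 2012) and later with one sequence object per key. The object name below is illustrative:

-- Hypothetical SQL Server 2012+ alternative: a sequence per key, seeded
-- with the next value in line (41 follows the sample value of 40)
CREATE SEQUENCE dbo.seq_pr1 AS INT START WITH 41 INCREMENT BY 1
GO

-- Each caller draws its own value without updating a shared row, so
-- concurrent transactions do not block one another on tbl_kvp
DECLARE @val INT
SET @val = NEXT VALUE FOR dbo.seq_pr1
SELECT @val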

I therefore created a table that had an identity field and a dummy char(1) column into which I would keep inserting a dummy value to generate the next identity value. With this approach, I could convert the single horizontal row for a key into a vertical structure provided by the identity field. Building a vertical structure meant that a transaction could insert records into this new table and hold on to those records, while other transactions were free to insert their own records and get the next value in the sequence.

Here's some sample code that demonstrates this. First, the vertical table for the PR1 key (the seed value is the next key value in the sequence):

CREATE TABLE tbl_vert_pr1(id INT IDENTITY(45, 1), dummy_col CHAR(1))

Here is the modification to USP_Get_Value_For_Key to cope with the vertical structure

ALTER PROCEDURE [dbo].[USP_Get_Value_For_Key]
   (@key NVARCHAR(50), @value INT OUTPUT)
AS
BEGIN
 -- Replaced logic to look up vertical structure table
 -- Naming convention chosen for the vertical tables was tbl_vert_<key>
 
 DECLARE 
  @exec_sql NVARCHAR(4000), 
  @param_defn NVARCHAR(1000), 
  @value_OUT INT
 
 -- By making use of the output clause, the identity value could be 
 -- obtained directly, instead of having to perform a "select max(id)"
 -- operation, which could get blocked if another insert statement was 
 -- holding on to a record as part of a transaction

 SELECT @exec_sql = 
 N'DECLARE @op_table TABLE (id INT); ' +
 N'INSERT tbl_vert_' + @key + N' OUTPUT inserted.id 
   INTO @op_table VALUES ('' ''); ' + 
 N'SELECT @value_OUT = id from @op_table'
 
 SELECT @param_defn = N'@value_OUT INT OUTPUT'
 
 EXEC SP_EXECUTESQL @exec_sql, @param_defn, @value_OUT = @value OUTPUT
END

If you were to now execute two calls to the business process stored procedure in separate concurrent sessions and run the trace script in another session, you would observe that no blocks get picked up.

You would also notice that the procedure executed first fetches the next available key value in the sequence, while the procedure executed second fetches the value after that, proving that both transactions can obtain and use their respective keys without blocking each other. The same logic was applied to all keys in the KVP table, which resolved the concurrency issues associated with it.

Conclusion

Every real-world concurrency problem tends to have its own unique solution. In the example above, the solution involved using the identity property, since it could convert a single horizontal record into a vertical structure. The best solution, of course, is not to build database objects that violate good design practices in the first place!

 

Introducing Visual Studio Team Foundation Server 2010

February 8, 2011


Introduction

When talking about the software development lifecycle and the processes associated with it, many developers immediately think about the waterfall model: first you design an application, then you write and test it, and finally you move it to production. With today's agile methods, however, the development process is different and involves frequent repetition of design, coding and testing (not necessarily in that order).

To write software professionally, one needs a process. Whether that process is ad hoc or more formal, a process is still in place. But creating maintainable code that still works years after the initial version requires both a good process and a sound design.

In almost any field today — software development included — there is a lot of pressure: time and budget constraints, a limited number of people and changing workloads. The tendency in such environments is to cut corners. But as a software developer, you already know that this is one of the major reasons why so many software projects fail.

Although there is no miracle cure to solve all software development problems, one can get far by using the right tools and processes that have already been thought out. Microsoft has also noticed this, and in 2005 delivered a product suite called Visual Studio Team System (VSTS). The product helped software development companies focus on what they know best: development.

Fast forward to today, and you will find that VSTS has become Visual Studio 2010 with a server-based product called Team Foundation Server (TFS). This server application is at the core of Microsoft's application lifecycle management (ALM) strategy, and thus worth knowing. TFS matters to .NET developers and native C++ developers alike. In addition to developers, the product offers many things for testers, graphical designers, architects, project managers and even customers. TFS can be thought of as a multi-role product.

What is TFS, Anyway?

Team Foundation Server (TFS) is a server product from Microsoft that allows a development team, among other things, to communicate, share information, file bugs, manage requirements, collect timesheets, and work with source code versions.

Specifically, TFS implements robust version control as core functionality of the server and, on top of that, a multitude of functions. The basic architecture is that a TFS server works along with an SQL Server database (a central store for information) and with one or more client applications. The most notable client application is Microsoft's own Visual Studio development tool, but plug-ins have been written for other development environments as well, including Eclipse. Also, Microsoft's own Expression Blend has built-in integration with TFS. Figure 1 shows the TFS architecture at a high level.


Figure 1. The high-level architecture of Team Foundation Server.

TFS server communicates with the outside world using web services over the HTTP protocol. This means that the solution is not limited to a certain geographic location or a single office. Provided that network connections are available, developers and other people in the team can connect to the server no matter where they are.

In addition to advanced version control – much more robust than Visual SourceSafe ever was – TFS offers integration with other Microsoft products to help, for instance, with document sharing, team collaboration and reporting. For example, TFS can be integrated with SharePoint and Microsoft Project, and it uses SQL Server Reporting Services (SSRS) internally to create reports.

Since TFS is a server product, it does not have an end-user interface of its own. Instead, all the functions that TFS offers are used through a client application, Visual Studio being the premier client at this writing. Using the Team Explorer window (Figure 2) in Visual Studio, a developer (or any other team member) can see how the current projects are going, how many bugs are still open, or how many hours of work are still required to get to the next milestone.


Figure 2. The Team Explorer window is the main integration window for working with TFS.

In addition to the Visual Studio integration, TFS can also be accessed through a web based interface (Figure 3). Previously, this was a separate product, but today, it is automatically installed and enabled when you install TFS version 2010. With the web interface, you can do many of the same things you could with the integrated Team Explorer client inside Microsoft Visual Studio.


Figure 3. Team Foundation Server Web Access provides an easy way to get an insight into a team project.

TFS and Team Work

If version control were everything that TFS offered, there would not be much to distinguish it from other good version control solutions. However, TFS is much more than just version control, and one of the areas where it shows its power is team collaboration. One of the most important ways TFS helps a team reach its goals is by letting everyone on the team know the current status of the project. This is done through what are called work items, which can be considered electronic forms of different types. Common work item types are requirements, bugs, tasks and user stories (Figure 4).


Figure 4. Work items provide key functionality within TFS.

By filling in these electronic forms properly, everyone on the team can get a list of active bugs or query all the tasks assigned to them. For the whole team, TFS can estimate the remaining work based on the information in the work items. Work items can be queried through the Team Explorer window, which contains standard queries for bugs and tasks, as well as for work assigned to an individual team member.

By default, work items are filled in directly inside Visual Studio or the TFS web interface. Sometimes, however, it is much more convenient to manage work items using Excel. Luckily, TFS contains a ready-made Excel integration that allows you to download work items into an Excel sheet, edit them, and upload them back to the server. This works through a handy add-in that can be installed into Excel 2007 and 2010 (Figure 5). There is also integration with Microsoft Project.


Figure 5. Working with dozens of work items is easy with the Excel add-in.

In addition to work items, information can be shared through a SharePoint portal page. In TFS, each team project can be associated with a SharePoint portal page. File-based documents can be easily shared within the team, and features such as document versioning and search are immediately available. The SharePoint portal can be launched directly through the Visual Studio Team Explorer plug-in.

Process Models and Automating Builds

In Team Foundation Server, a team project is a logical combination of related work items, files under source control, and documents shared through a SharePoint portal. A team project is different from a Visual Studio source code project; in fact, a single team project can contain dozens of Visual Studio projects, such as those written with C#. A team project can also be thought of as a boundary for reporting, security settings and user rights.

A team project is associated with a process model. By default, TFS comes with two process models (see Figure 6): Microsoft's implementations of agile methods (called MSF for Agile) and CMMI (MSF for CMMI). The process model of a team project cannot be changed except by re-creating the team project (with certain exceptions).


Figure 6. TFS users often start with the MSF for Agile process model.

Among other things, a process model defines the work item types available in the project. For instance, the MSF for Agile process model defines six work item types: bugs, issues, shared steps, tasks, test cases and user stories. Work items can be fully customized, and you can add, modify or delete fields in work items. You can then save your modifications as a process template, which can be the basis for new team projects.

A process model also defines the state transitions of work items. For instance, in certain organizations, a bug fix must be confirmed by another tester before the bug can be marked as fixed. Similarly, a developer might not be the person allowed to mark a bug as fixed. These kinds of rules – call them development process rules – can be part of a team project's process model. Just as you can customize work items to contain the fields you prefer, you can also customize the state transition rules to match the way your organization works. In fact, to benefit most from TFS, you should customize the product to the way your organization works. This work can take much longer than the software installation of TFS – days, weeks or even months.

If process models can take weeks to hone, a feature called build automation saves time in routine tasks. With build automation, a team can define MSBuild-based automated operations to fetch the latest code from version control, build it, test it, and even package the application on a set schedule. Continuous integration is available, too.

For instance, nightly builds can be created with a simple setup routine, and TFS will from then on automatically deliver new versions for testers to test. Unit tests can be run along with the builds, generating status reports that are directly viewable inside Visual Studio or delivered to developers via e-mail. In Visual Studio, build automation is known by the name Builds.

Who Would Benefit From TFS?

The original Visual Studio Team Foundation Server versions were aimed at teams with at least five members: developers, testers, architects and project managers. Today, all of these team roles are still Microsoft's focus, but TFS scalability has grown in both directions, up and down. That is, TFS 2010 now provides good support for everyone from the lone developer wearing multiple hats to large organizations with hundreds of team members. Except maybe for a couple of the largest corporations, the technical platform TFS uses can support almost any organization size.

Licensing-wise, Microsoft gives TFS usage rights to everyone who purchases Visual Studio with an MSDN subscription. The TFS server with a license for five users can also be purchased separately at an MSRP of $499, and additional Microsoft-customary Client Access Licenses (CALs) cost the same amount per user. At this price level, TFS should be within the reach of every professional development shop.

Speaking of technology, smaller organizations especially benefit from TFS's modest system requirements. It can run on recent Windows versions, both on workstation operating systems such as Windows 7 and on servers (such as Windows Server 2008). An SQL Server 2008 database or later with Reporting Services is required, but this could be the free version that comes with the TFS license; alternatively, a regular Standard edition works fine.

TFS uses web services, so it requires the Microsoft IIS (Internet Information Services) web server role to function. By default, communication is done over HTTP, but this can be changed to the more secure HTTPS by simply modifying IIS settings and installing a proper SSL certificate.

Since TFS uses standard Microsoft technology, it can scale well. With modern hardware, a single TFS server can serve hundreds of users, provided that the SQL database runs on a separate server; the largest installations are in the range of several thousand users per server. In simple setups, the technical installation usually takes less than an hour.

Conclusion

Microsoft Team Foundation Server is a versatile product that addresses several different needs of software development teams. TFS integrates version control, requirements management and bug tracking, document sharing and team collaboration, robust reporting, web access, controlled access, and even customer insight into the team's work (security settings permitting, of course).

Many of the features in TFS are accessed directly through Visual Studio, meaning that no time is wasted switching back and forth between different tools. Silverlight and WPF developers will be happy to learn that Expression Blend can integrate with TFS.

Even though TFS can scale from the smallest to the largest organizations, its system requirements are easy to fulfill. Installation of the product takes only a short while, but much more time goes into organizational issues. In fact, to benefit most from TFS, an organization should crystallize the process it is using and then fine-tune TFS to suit that need.

A customized TFS quickly becomes an integral part of the organization’s software development process. I’m sure you won’t be looking back.

32-bit Vs. 64-bit Systems: What’s The Difference?

February 6, 2011


 

Not so long ago, we received a query about buying a computer and what all the hype was about concerning 32-bit vs. 64-bit systems. Since the gift-giving season is rolling around and prices for computer equipment are going down, I decided to try to give you all some help.

32-bit vs 64-bit Computing

To start, let me explain the difference between 32 and 64-bit systems, and the ‘why’ behind it.

In your computer, you have several 'items' that, normally, you don't concern yourself with. One of those is the 'data bus'. It doesn't go across town or anything like that, but it does provide transportation: basically, it connects memory to the rest of the system, including the processor, which does all the thinking in your computer.

The data bus is used to move data around inside your computer. In a 32-bit computer, the width (or size) of the data bus is 32 bits. A 64-bit bus is twice as wide, so the system can move twice as much data around. Being able to process more data means a faster system — but only for specific things. Normal office productivity and web surfing will show no advantage at all, whereas graphics processing and scientific calculations will go much faster.

Processor manufacturers are working out ways to provide 64-bit processors that are faster and run at cooler temperatures, so you may hear about multi-core processors and other highly technical terms that are related to 64-bit computing. So, as they say, the band is on the bandwagon playing a bouncy tune, the parade has started, and 64-bit is being touted as the up-and-coming technology for computing.

Windows 64-bit

However, it is also said that the thing about bandwagons is that they never take you where you want to go! For example: Windows for 64-bit is not where it should be.

It has been reported that 64-bit Vista, Microsoft's next Windows release, already has severe problems; it has already had critical updates applied prior to release. Nothing like getting a head start, is there? Another problem with 64-bit is the general lack of stable software to run on these Ferraris of the computer world. The entire system has to be designed and built for the wider data bus, too, so the system will cost more. To be fair, most 32-bit software will run on a 64-bit system, but that makes one wonder why one spent the money in the first place.

Pricing: 32-bit vs 64-bit

One good thing, however, is that the price of 32-bit systems has been dropping. Looking around the web nowadays reveals some really surprising price drops almost across the board, even on laptops.

Already, prices for desktop systems are running about half of what they were this time last year. Over the next few months, those prices will drop even further, what with the retail sales rush from October through December. If you need (or want) to buy a new system, this is the time. Start looking for good deals on new equipment now.

It's also a good time for the used computer market. Lower prices on new systems will result in more used equipment being made available, and at rock-bottom prices. Don't ignore them!

Who needs 64-bit?

I suppose the question now is, "Who would benefit from buying a 64-bit system?" The answer: mostly businesses, universities, scientific groups, and government. If you produce videos or computer art, or develop programs, 64-bit systems will be helpful. But for the home user, 64-bit is currently overkill. You won't see faster performance in activities like writing, spreadsheet processing, or web browsing, so (in my opinion) save your cash for something else (like a bigger hard drive or more RAM).

Have Fun!

 

 

BitTorrent Lawsuits: The Red-Light Cameras of the Internet

February 4, 2011




By Mike Elgan
The 1993 sci-fi action movie Demolition Man depicted a future without violence or disorder. That society achieved this, in part, by using technology to automate penalties for rule-breaking.

A law firm called US Copyright Group was granted permission last year to use the software to identify and sue alleged copyright violators.

The firm is one of several that are spearheading a new way to simultaneously enforce copyrights and collect significant revenue from people who have downloaded movies without paying for them on peer-to-peer (P2P) services like BitTorrent.

Other firms that specialize in mass copyright enforcement lawsuits include the Adult Copyright Company and the Copyright Enforcement Group. Their clients started out as mostly indie film companies, but have recently been dominated by adult movie companies.

The user advocacy group Electronic Frontier Foundation opposes mass copyright infringement lawsuits, saying that the practice “deprives the defendants of fair access to individual justice.”

But if the courts continue to allow such suits, it’s great news for the companies doing the suing, because it brings in huge revenue, intimidates violators and gains new publicity for older movies. There’s really no downside.

There’s a growing sense that filmmakers are beginning to look at mass copyright infringement lawsuits as just another revenue-producing distribution channel. Rather than customers paying first and downloading later, the tactic makes downloaders pay for what they have already downloaded.

The approach differs from that employed years ago by the music industry's RIAA, which sought to "make an example" of seemingly random violators in an effort to change the cultural acceptability of downloading music without paying for it.

The new movie model also aims to scare people away from downloads. But the studios are also trying to monetize piracy and make real money.

According to the blog TorrentFreak, nearly 100,000 people have been sued for downloading content on peer-to-peer networks like BitTorrent. While many of the cases have been dismissed, some 71 percent are still active. The number of lawsuits filed apparently rose dramatically in the last three months of 2010. It's a growth industry.

As it now stands, according to the site, some 68 cases are still active, involving 70,914 individual alleged downloaders.

One studio, Liberty Media, is an industry leader in copyright-lawsuit collections. Rather than identifying and tracking down individual violators, the company is calling for users of the HotFile site to voluntarily turn themselves in to take advantage of a limited-time "amnesty program."

If downloaders confess during a two-week period beginning Feb 8, fork over $1,000 and promise not to do it again, they will be granted “amnesty” by the company and not be sued. (The maximum penalty in copyright lawsuits allowed by law is $150,000 per downloaded movie.)

Liberty Media even named PayPal in the suit, calling on the online payment company to freeze HotFile’s payments account.

 

Who Cares About BitTorrent Downloaders?

If you don't use P2P networks to download movies, why should this concern you? The reason is that if the tactic is effective – and so far it appears to be – we can look forward to all kinds of content companies, from TV studios to software makers, jumping on the mass copyright infringement lawsuit craze.

For example, are you violating the many end user license agreements, or EULAs, that you have agreed to and which give you permission to license the software you think you own? Have you read the EULAs? Have you ever read a single EULA?

If companies can automate the discovery of violations of any kind, and add your name to a mass lawsuit, followed by the offer to settle for a few hundred bucks, we could see an explosion in the use of mass copyright infringement lawsuits for software and other kinds of products or content.

The tactic may be especially tempting for a company facing a life-threatening loss of revenue. If you’re going to go out of business anyway, you might as well sue your users.

In any event, an era of computer-enhanced mass copyright infringement lawsuits is upon us, and, while it’s of direct immediate concern to BitTorrent users, it’s something we should all be concerned about in the long run.

 

 

Horror Stories of IT Management

February 2, 2011


By Carla Schroder

Tech editor Carla Schroder dissects a recent article about problems with IT management. There’s plenty of finger pointing at workers, but who’s really to blame?

"Security fail: When trusted IT people go bad" has a great title. Then it's all downhill. I suppose it's appropriate for an audience of managers who want cheerleading for bad management more than good information.

It starts off with a tale of ultimate horror: not only is your trusted systems administrator selling you pirated software and incurring the wrath of the BSA (Business Software Alliance), he is also running a giant porn server from the company network and stealing customer credit card numbers.

Then it takes the obligatory gratuitous swipe at “rogue” San Francisco admin Terry Childs.

Then we are introduced to Sally, an ace IT person who plants logic bombs in all the servers after learning that the IT department is being outsourced to India.

The final horror: a rogue admin, in retaliation for the company busting the secret pirated satellite TV equipment business he was running off its e-commerce site, deletes the entire corporate encryption key ring even as security personnel are charging into his office shouting "Stop!" Eighteen staff-years of lost productivity result, because there are no backup copies.

 
