Amazon Web Services Adds Support for ASP.NET and SQL Server

May 31, 2012


The cloud computing giant is courting Microsoft developers.

Amazon Web Services (AWS) has announced two changes that enhance support for Microsoft technologies on its cloud platform. First, Elastic Beanstalk now supports ASP.NET. Elastic Beanstalk handles deployment details such as capacity provisioning, load balancing, auto-scaling, and application health monitoring, making it easier to deploy ASP.NET applications in the AWS cloud.

Second, the company has launched Amazon RDS for SQL Server. RDS is AWS's managed relational database service. Previously it supported MySQL and Oracle; now it also supports Microsoft's SQL Server.

How I Would Design a Programming Degree

May 28, 2012

Yesterday, I attended ECPI Columbia's Spring Advisory Board meeting. I took part in the panel discussion on their IT degrees and viewed it as an opportunity to explain what I consider lacking in developer education. ECPI is regularly involved in community activities, providing facilities to user groups and code camps, so it was my pleasure to contribute to making their curriculum more valuable to both current and future programmers seeking degrees.

The Developer Degree

There was only one programming degree up for discussion: a Bachelor's in Database Programming. Naturally, that meant it had many database courses with minimal relevance to the modern software developer, unless you're coding directly against a database without any form of data-access abstraction. Given the nature of this degree, I skipped over the database courses and focused my feedback on introducing agile practices where appropriate. For example, in the coding project classes, I think it's better for the students as a whole to work in an Agile/Scrum manner. I'm not sure exactly how those courses are run today, but if they're doing waterfall, or avoiding teams altogether, the experience is far less valuable. I know some readers will criticize me for suggesting a practice where the whole team sinks or swims, but that's the way it is in the real world: either the project is released or it isn't.

I also recommended replacing the four object-oriented courses with courses that progress through the major programming paradigms. The counterargument was that students should learn different programming languages they can put on their resumes. I suppose that is useful to some degree, but without work experience in a language, it will not help much. The issue I have is that covering Java, Visual Basic, and C# is like learning different dialects of English: their structure is so similar that it's a stretch to say you really know another language. I don't program in Java, but I can begin writing class libraries with little difficulty. The primary differences are in the frameworks used, and that's not even the case with VB and C#.

If I’m hiring a developer fresh out of school for a project utilizing an object-oriented language, I want to know that the candidate understands object-oriented concepts and design. There are many other things I would like in a junior developer, but the language isn’t much of a barrier if the concepts are understood. Of course, my criterion for language experience is different when confronted with a more seasoned developer.

How I Would Do It


My experience on this advisory board got me thinking about how I would design a degree program that would properly prepare a student to enter the business world as a developer. So, I'm going to list the courses I would require for a software developer degree (not a database programmer degree) in a serialized course setting. If you're reading this blog and don't know what serialized means, I'll assume you're having one of those ill-caffeinated moments: in the context of a degree program, it means only one class is taken at a time, typically in a very short cycle.

Note: ECPI's program is required by its accreditation to carry certain courses, and this list is in no way a copy of their curriculum. This is simply me rattling off ideas; actually implementing my approach would require much refinement.

Professional Conduct: You may not be a decent human being, but you should at least learn to act like one.
Professional Communication: You don't need to have the prose of Faulkner. You don't even need to know who this Faulkner fellow is. In fact, I suggest you avoid his writing style. On the other hand, your emails and reports should not read like a teenager's text message.
Critical Thinking: You're never going to make it as a software developer if you can't analyze concepts with appropriate reasoning. You should also be able to identify when you're wrong, because guess what: you're not always right.
Logic: Some developers manage to lack a basic understanding of logic. This is the source of some of the stupidest bugs known to mankind.
Imperative Programming: With a firm basis in logic, imperative languages will come easily. Don't be scared by the name; it's basically conditions and statements, kind of like, "if I study all night, then I will be sleepy."
Unit Testing: Testing your code by running the program will either become tedious or neglected. Automate those tests, and write them up front in the future.
Object-oriented Programming: Imperative programming is good and all, but what is this 'I' thing and how can it get the adjective 'sleepy' by the verb 'study'?
Declarative Programming: No one wants to hear you yap about how you missed class from staying up all night studying. We certainly don't want to hear about what you were studying. Just tell us you missed class.
Refactoring: Your code is ugly; clean it up.
Optimizing: You've learned how to write your code so any decent developer can maintain it. Now it's time to make it ugly again. Don't worry, you already know how to hide your well-optimized, hideous implementations.
Object-oriented Design Patterns: So, you came up with an intriguing solution to a problem. Someone else came up with it before you; stop wasting your time "inventing" new things.
Relational Data: One day, your boss will ask you for a report. The next day, you will be asked for another, completely different view of the same data. Learning how to store and retrieve relational data will save you from having to become an Excel expert.
Object-relational Mapping: Enough data queries and your code will once again become an unreadable mess. Use the techniques of ORM to store and retrieve state.
Human-readable Data: Store, transmit, and read data in formats computers, humans, and possibly other biological life-forms can read. Bonus: user interfaces can be specified in a similar manner.
User Interface Design: You may not be an artist, but that's no excuse for creating unusable user interfaces. You should be severely punished for even considering that Nyan Cat's rainbow elimination can be used as a menu.
Tools of the Trade: A dog swallowing the USB drive containing your classwork is sad, but it's no excuse for not turning it in. Developers have many tools at their disposal to ensure their work is never lost, their software always builds, and their tasks are always known. Sounds horrible, I know, but they also have tools to prevent rote memorization and typing.
Requirements and Specifications: If you thought coding was tough, try translating what some people consider English into a logical construct.
Organizing: Do you want to be a basement dweller? I thought not. Teamwork is essential in creating complex, functional software. Knowing the techniques that work for creating continuously deliverable software goes much further than hacking it alone.
Develop Software: Now that everyone understands how to organize, how to write specifications, and how to code, it's time to put those skills to the test. The entire class will form a development team and create software for the product owner (your instructor).

That's nearly enough credit hours to fill an associate's degree. There are several options for rounding it out, but I think I would go with a targeted math class focusing on operators, their properties, and functions, since it would be a natural and useful extension of the class on logic. Discrete math would be valuable as well, particularly since it covers data structures a developer may need to implement and concepts that are extremely valuable for more advanced programming topics (e.g., combinatorics).

On the Glaring Omission

Notice I didn’t include classes in algorithms and data structures. Generally, if it’s an algorithm deemed worthy enough to be taught from a textbook, it’s either an extremely rare, highly-optimized solution or it has been implemented directly in the default frameworks of common production languages. Useful data structures are almost always implemented in default frameworks (with the exception of the tuple, probably due to its impracticality without language support). Practical knowledge on algorithms is obtained from Imperative Programming, and the most useful data structures are covered by Object-oriented Programming. Let me explain: what better way to introduce linked lists than with the concept of the containment form of composition? Then, you can show the already implemented linked list in .NET: LinkedList<T> (what a surprise!). Afterwards, be sure to explain how almost everyone uses the array-backed List<T> instead. You can take this further in the Declarative Programming class to show that many modern C# developers have taken up immutable manipulation of sequences using LINQ.
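\
To make that concrete, here is a minimal C# sketch of the progression described above; the collection contents and variable names are mine, purely for illustration.

using System;
using System.Collections.Generic;
using System.Linq;

class CollectionsDemo
{
    static void Main()
    {
        // The containment form of composition, already implemented for you:
        // a doubly linked list shipped with the framework.
        var linked = new LinkedList<string>();
        linked.AddLast("beta");
        linked.AddFirst("alpha");

        // What almost everyone actually uses: the array-backed List<T>.
        var numbers = new List<int> { 5, 4, 8, 1 };

        // Declarative, immutable-style manipulation with LINQ: the source
        // list is left untouched and a new sequence is produced instead.
        IEnumerable<int> evens = numbers.Where(n => n % 2 == 0)
                                        .OrderBy(n => n);

        Console.WriteLine(string.Join(", ", evens)); // prints "4, 8"
    }
}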

I'm not saying that a class on algorithms and data structures is useless; it's just a poor use of time if your goal is to be effective immediately out of school. If I saw someone writing their own bubble sort on a C# project, I would want to know why they're wasting their time. When I took a position as a junior developer on a Delphi project, the senior developer reviewing my code asked why I was reading data into an array and sorting it by hand. He then introduced me to TStringList, and I never looked back. I recall defending myself by stating that my manipulations were faster. They probably were, but I wasted far more time and made my code much less readable. The real reason: I had transitioned from Pascal to Delphi and was unaware of classes that already implemented the functionality I was used to creating myself. If you're going to write production code, it's better to learn the higher-level constructs and implementations available first, and then learn the lower-level way of doing things for clarification, understanding, and future flexibility.
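\
A hedged illustration of that point in C# (the array contents and names are invented for the example): the hand-rolled sort is what a graduate fresh from an algorithms course tends to write, while the framework calls beneath it are what production code should normally use.

using System;
using System.Linq;

class SortingDemo
{
    // The textbook version: correct, but rarely worth writing by hand.
    static void BubbleSort(int[] values)
    {
        for (int i = 0; i < values.Length - 1; i++)
        {
            for (int j = 0; j < values.Length - 1 - i; j++)
            {
                if (values[j] > values[j + 1])
                {
                    int temp = values[j];
                    values[j] = values[j + 1];
                    values[j + 1] = temp;
                }
            }
        }
    }

    static void Main()
    {
        var data = new[] { 42, 7, 19, 3 };

        // Reinventing the wheel:
        var handRolled = (int[])data.Clone();
        BubbleSort(handRolled);

        // The framework's tuned sort:
        var framework = (int[])data.Clone();
        Array.Sort(framework);

        // Or declaratively, via LINQ, without mutating anything:
        var ordered = data.OrderBy(n => n);

        Console.WriteLine(string.Join(", ", ordered)); // prints "3, 7, 19, 42"
    }
}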

What is infinitely more useful for the modern application developer is knowledge of design patterns. Consider this: design patterns are "recurring solutions to common problems in software design." Replace "design patterns" with "algorithms and data structures," and they fit the exact same definition. The difference is that most named design patterns are still informal, having yet to be implemented in abstract form (formalized) or made part of the language (invisible). Note that patterns that have become formalized or invisible are no longer considered design patterns. Since developers still need to implement design patterns themselves, time is better spent learning them than reinventing them.

I didn’t address this particular subject at the board meeting, but I probably should have; perhaps another year.

Other Classes of Note

Critical Thinking and Logic should really be taught from elementary school through high school. The number of people who can't properly evaluate propositions or who never analyze their own viewpoints is staggering. Depending on "common sense" over rational thought is what leads people to make reasoning errors like the Gambler's Fallacy. Since the material in these classes is essential for rational thought, you'd better believe it's essential for software development.

I left out general math classes. There are parts of math that are essential for software developers, and algebra is one of them. However, it's really specific pieces of standard math and algebra that matter, and it would be nice to have a class focused on those parts. When I was in high school, I was told I would need to learn calculus to make it as a programmer. I did learn calculus, and I find it fascinating (including the fact that it was invented independently by Newton and Leibniz). However, for the vast majority of development jobs, it is completely unnecessary. Boolean logic carries the day, but it's also necessary to understand standard operators, the properties of those operators, and functions (as templates for calculations).

I had no better name for Tools of the Trade, but it is essential to understand common tools used in software development. Of course, the concepts are important as well, so it would be a good idea to include continuous integration and such.

I feel that half a dozen classes on SQL are unnecessary, but you should learn about Relational Data. In the process, you learn relational algebra and tuple relational calculus (not differential or integral calculus), and by learning how to manipulate relational data, you learn set logic. I added the Object-relational Mapping course because it's much more practical than writing large queries by hand. Let your DBAs specialize in that; you specialize in software development. Besides, many startups are using non-relational document databases such as RavenDB.
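\
A minimal sketch of what I mean by ORM, assuming an Entity Framework-style mapper (the SalesOrder entity, its properties, and the query are hypothetical): the set logic is expressed in the language itself, and the mapper translates it into SQL.

using System.Data.Entity; // Entity Framework's DbContext API; other ORMs look similar
using System.Linq;

// Hypothetical entity mapped to a SalesOrders table.
public class SalesOrder
{
    public int Id { get; set; }
    public string Customer { get; set; }
    public decimal Total { get; set; }
}

public class StoreContext : DbContext
{
    public DbSet<SalesOrder> Orders { get; set; }
}

public static class ReportQueries
{
    // Instead of hand-writing SQL, express the set logic in LINQ;
    // the ORM translates the expression into a SQL query at runtime.
    public static decimal TotalSpentBy(StoreContext db, string customer)
    {
        return db.Orders
                 .Where(o => o.Customer == customer)
                 .Sum(o => o.Total);
    }
}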

Human-readable Data is a necessity if you're writing integrated systems. Guess what: your dynamic website is rendered on a client that calls a server which returns data. You will also most likely integrate both external and internal services in the applications you develop, so you need to know JSON and XML. I don't think these are particularly hard to grasp, so I clumped in the markup languages used to describe user interfaces (HTML, XAML) as well. Markup for UI is human-readable data after all, though you could design entire courses around those technologies, and it's also important to teach style sheets and resources, so I would definitely split it up eventually. The UI portion doesn't fall under User Interface Design, as that course is specific to usability. Considering how many developers create awful interfaces, I wish every programming degree program included that sort of class.
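\
For the data half of that course, here is a small C# sketch showing one record rendered in both formats. It assumes Json.NET for the JSON side and the built-in XmlSerializer for the XML side, and the Invoice type is invented for the example.

using System;
using System.IO;
using System.Xml.Serialization;
using Newtonsoft.Json; // Json.NET; any JSON serializer makes the same point

public class Invoice // hypothetical record used only for illustration
{
    public int Number { get; set; }
    public string Customer { get; set; }
}

class HumanReadableDataDemo
{
    static void Main()
    {
        var invoice = new Invoice { Number = 1001, Customer = "Contoso" };

        // JSON: {"Number":1001,"Customer":"Contoso"}
        string json = JsonConvert.SerializeObject(invoice);

        // XML: roughly <Invoice><Number>1001</Number><Customer>Contoso</Customer></Invoice>
        var xmlSerializer = new XmlSerializer(typeof(Invoice));
        var writer = new StringWriter();
        xmlSerializer.Serialize(writer, invoice);

        Console.WriteLine(json);
        Console.WriteLine(writer.ToString());
    }
}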

Requirements and Specifications should cover things like how to gather requirements and how to write Gherkin-style specs, so tests can be designed and code written without the ambiguity that plagues most shops.
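\
For example, a Gherkin-style scenario maps almost line-for-line onto an executable test. A hedged sketch follows, using NUnit; the feature, the ShoppingCart class, and its members are all hypothetical.

// Feature: Cart totals
//   Scenario: Adding an item updates the total
//     Given an empty cart
//     When I add an item priced at 9.99
//     Then the cart total is 9.99

using NUnit.Framework;

public class ShoppingCart // hypothetical domain class for the example
{
    public decimal Total { get; private set; }
    public void AddItem(string name, decimal price) { Total += price; }
}

[TestFixture]
public class CartTotalSpecs
{
    [Test]
    public void Adding_an_item_updates_the_total()
    {
        var cart = new ShoppingCart();       // Given an empty cart
        cart.AddItem("book", 9.99m);         // When I add an item priced at 9.99
        Assert.AreEqual(9.99m, cart.Total);  // Then the cart total is 9.99
    }
}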

I see Organizing as an Agile/Scrum course.

I clumped functional programming under Declarative Programming. That's accurate, but it doesn't do functional programming justice. I didn't put generic programming anywhere despite its heavy usage nowadays, though I think it would fit under Declarative Programming. I know it's not technically part of the declarative paradigm, but it does seem related to one of declarative programming's sub-paradigms: constraint programming. It would probably be best to add a separate course for generic programming, aspect-oriented programming, and other useful paradigms.
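\
For readers unfamiliar with the term, this is the kind of thing I mean by generic programming, in a minimal C# sketch of my own (the names are invented for the example): one algorithm written once against a constraint, usable for any type that satisfies it.

using System;

static class GenericDemo
{
    // A single implementation that works for any comparable type.
    static T Max<T>(T a, T b) where T : IComparable<T>
    {
        return a.CompareTo(b) >= 0 ? a : b;
    }

    static void Main()
    {
        Console.WriteLine(Max(3, 7));             // 7
        Console.WriteLine(Max("apple", "pear"));  // pear
        Console.WriteLine(Max(2.5, 1.5));         // 2.5
    }
}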

Suggestions?

The courses I laid out are of my own imagining, and the list is really meant to demonstrate the kind of skills I would like developers to have when they leave the university behind to find a job. Unless they've been programming outside of class out of sheer passion, fresh graduates seem to grasp little more than an imperative style of coding. They eventually seem to either go the way of the business analyst or gain the practical knowledge necessary to become awesome developers. This tells me two things: 1) more people should be pursuing business analysis in the first place, which could be addressed by offering two- and four-year business analyst degrees, and 2) developer/software engineering (not CompSci) degrees should focus on topics more relevant to modern developers.

How would you craft a developer degree?

(Source: http://www.kodefuguru.com)

The Future of Algorithms

May 18, 2012

Article by David Hunter

Algorithms are taking over the world, at least the computational part of it, and that could be a good thing.

In a real sense, the rise of algorithms is a sign of human intellectual maturity: it reflects our capacity as a society to manage technology and science at a sophisticated level, and it represents the coming together of our mastery of computational science with our ability to abstract the key essence of a process, to generalise and commoditise it.

The ubiquity of algorithms is in fact the next logical step in our technological evolution as a species and perhaps marks our evolution towards super-species status.

Algorithms translate a process into instructions that a computing machine can understand, based on a mathematical, statistical and logical framework. They are usually developed to minimise and rigorise the computing steps involved in a process or formula, thereby maximising its efficiency in terms of computing resources while improving its accuracy and verifiability.

Algorithms come in all shapes and sizes and have been around a long time, well before the official computer age. Euclid used one to compute greatest common divisors in ancient Greece; Indian mathematicians developed algorithmic procedures for arithmetic; and the 9th-century Muslim scholar al-Khwarizmi, from whose name the word derives, documented them systematically. Newton later relied on algorithmic methods in formalising his theory of the forces of nature.
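\
To make the idea concrete, Euclid's greatest-common-divisor procedure is the canonical example: a finite, unambiguous recipe that shrinks the problem at every step. Here it is as a short sketch in C# (my choice of language, purely for illustration).

using System;

class EuclidDemo
{
    // Euclid's algorithm: repeatedly replace the pair (a, b) with (b, a mod b)
    // until the remainder is zero; the last non-zero value is the GCD.
    static int GreatestCommonDivisor(int a, int b)
    {
        while (b != 0)
        {
            int remainder = a % b;
            a = b;
            b = remainder;
        }
        return a;
    }

    static void Main()
    {
        Console.WriteLine(GreatestCommonDivisor(1071, 462)); // prints 21
    }
}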

In the future almost every process or method will be converted to an algorithm for computational processing and solution, as long as it can be defined as a series of mathematical and logical statements, ideally capable of being run on a Turing machine.  

A Turing machine is a mathematical model of a general computing machine, invented by Alan Turing, which underpins the computers we use today. Turing machines come in a variety of flavours, including deterministic, non-deterministic, probabilistic and quantum, all of which can be applied to solve different classes of problems.

But regardless, any computation, even one based on an alternative logical model such as Cellular Automata or a recursive programming language, can in theory also be performed on a Turing machine. The brain, however, because of its enormous non-linear problem-solving capacity, has been described by some researchers as a super-Turing machine, but the jury is still out on whether it really falls into a different computational class from the standard Turing model.

Many algorithms incorporating powerful standard mathematical and statistical techniques, such as error correction, matrix processing, random number generation, Fourier analysis, ranking, sorting and Mandelbrot set generation, were originally coded as computer routines in languages dating from the 1950s and 60s, including Fortran, Algol, Lisp, Cobol and PL/1 (and later C++). Common algorithms were subsequently incorporated into mathematical software such as Mathematica, making them easier to access and apply.

They have now infiltrated every application and industry on the planet, applied for example to streamline and rigorise operations in manufacturing, production, logistics and engineering. They cover standard operational control methods such as linear programming, process control and optimisation, simulation, queuing, scheduling and packing theory, critical path analysis, project management and quality control.  

Engineers and scientists increasingly link them to AI techniques such as Bayesian and Neural networks, Fuzzy logic and Evolutionary programming, to optimise processes and solve complex research problems. 

But over time, following the flow of computerisation, the ubiquitous algorithm has extended into every field of human endeavour, including business and finance, information technology and communication, robotics, design and graphics, medicine and biology, ecosystems and the environment, and astronomy and cosmology; in the process applying data mining, knowledge discovery, prediction and forecasting techniques to larger and larger datasets.

Indeed, whenever new technologies emerge or mature, algorithms inevitably follow, designed to do the heavy computational lifting, allowing developers to focus on the more creative aspects.

Other algorithmic applications now cover whole sub-fields of knowledge such as game theory, machine learning, adaptive organisation, strategic decision-making, econometrics, bioinformatics, network analysis and optimisation, resource allocation, planning, supply chain management and traffic flow logistics.

In addition, more and more applications that were once the province of professional experts are being drawn into the vortex of the algorithm, including heart and brain wave analysis, genome and protein structure research, quantum particle modelling, formal mathematical proofs, air traffic and transport system control, weather forecasting, automatic vehicle driving, financial engineering, stock market trading and encryption analysis.

A number of such areas also involve high risk to human life, such as heavy machine operation, automatic vehicle and traffic control and critical decisions relating to infrastructure management such as dams, power plants, grids, rolling stock, bridge and road construction and container loading.  

The Web of course is the new playground for algorithms and these can also have far reaching impacts.

For example, in 2010 the Dow Jones Industrial Average dropped nearly a thousand points in a matter of minutes in what is now known as the Flash Crash. It appears that, for a few minutes, several trading algorithms were locked in a death dance, in much the same manner as two closely bound neutron stars before implosion, triggering a massive collapse in the value of the US stock market. It was a wake-up call to the fact that in any complex system involving multiple feedback loops, unforeseen combinations of computational events will take place sooner or later.

Even today's news headlines are shaped by algorithms. Not only is it normal for Internet users to select feeds matching the personalised content they prefer, perhaps on a feel-good basis, but stories are also selected and curated by search-engine algorithms to suit categories of advertisers. This raises the issue of algorithms being applied to create different bubble realities that may not reflect the priorities of society as a whole, such as global warming, democracy at risk or a critical food shortage.

A major dimension of the impact of algorithms is the issue of job obsolescence. It is not just the unskilled jobs of shop assistants, office admin and factory workers, marketing and research assistants that are at risk, but middle-class, white-collar occupations, from para-legals to journalists to news readers. As algorithms become smarter and more pervasive this trend will extend up the food chain to many higher level management categories, where strategic rather than operational decision-making is primary.  

And so we come to the millions of smartphone apps now available to support us in every aspect of our daily activities. They can also lead us to the dark side of a big-brother society, where, through pervasive monitoring of location, shopping transactions and social connections, every individual's life and timeline can be tracked and analysed using algorithms, with everyone eventually becoming a person of interest in the global society.

Social networks trade personal information to generate revenue, while the individual loses their right to privacy without receiving any compensation. Certainly the area of apps governing personal and lifestyle choice is now being invaded by ubiquitous algorithms in the form of recommendation systems. Much of the information garnered from social networks is filtered and personalised to guide lifestyle and entertainment choices: selecting an exercise regime, a relationship, an online book or author, a restaurant or a movie based on past experience and behavioural profiles. Already a third of US shoppers use the internet to make buying decisions.

These subliminal recommender systems represent the beginning of an always-on presence in each individual's life, tracking your car by GPS or recognising your face in a photograph, now combined with AI-driven virtual assistants such as Siri. More recent algorithms also have the potential to combine such information to infer further hidden aspects of a person's lifestyle.

But the real problem with such recommender systems is their poor record at forecasting, particularly in areas of complex human behaviour and desire. In addition, the inner logic governing their Delphic predictions is generally hidden and opaque, meaning that guesswork is conveniently covered up while decision-making becomes dumbed down.

Enterprises such as banks, insurance companies, retail outlets and government agencies compete to build algorithms that feed insatiable databases of personal profiles, constantly analysed for hidden consumer patterns to discover who is most likely to default on a loan, buy a book, listen to a song or watch a movie. Or who is most likely to build a bomb?

The rise of the algorithms embedded in our lives could not have occurred without the surge in the inter-connected, online existence we lead today. We are increasingly part of the web and its numerous sub-networks, constantly in a state of flux. A supermarket chain can access detailed data not only from its millions of loyalty cards but also from every transaction in every branch.

A small improvement in efficiency can save millions of dollars. The mushrooming processing power of computers means that the data collected can be stored and churned continuously in the hunt for persons of interest. So who is to stop them if consumer groups aren’t vigilant?

This is not too much of a nuisance when choosing a book or a movie, but it can be a serious problem if applied to credit rating assessment or authorisation of healthcare insurance. If an algorithm is charged with predicting whether an individual is likely to need medical care, how might that affect their quality of life? Is a computer program better able to calculate kidney transplant survival statistics and decide who should receive a donor organ?

Algorithms are now available to diagnose cancer and determine the optimum heart management procedure using the latest worldwide research. Can human doctors compete in the longer term and will algorithms be better at applying game theory to determine the ethical outcomes of who should live or die?

The ethics of data mining is not limited to privacy or medical issues. Should the public have more control over the application of algorithms that guide killer drones towards human targets? Eventually computer-controlled drones will rule the skies, potentially deciding on targets independently of humans as their AI selection algorithms improve. But if an innocent civilian is mistaken for the target or coordinates are accidentally scrambled, can the algorithm be corrected in time to avoid collateral damage?

So algorithms must have built-in adaptation strategies to stay relevant, like every other artifact or life form on the planet. If not, they could become hidden time bombs. They will require ultra-rigorous testing and maintenance over time because, like any process governed by a changing environment, they can become obsolete, as the Y2K bug and the automated trading anomaly described earlier show.

If used for prediction and trend forecasting, they will be particularly risky to humans. If the environment changes outside the original design parameters, then the algorithm must also be adapted immediately; otherwise prediction models and simulators, such as the proposed FuturICT global social observatory, might deliver devastatingly misleading forecasts.

As mentioned, a number of artificial intelligence techniques depend on algorithms for their core implementation, including genetic algorithms, Bayesian networks, fuzzy logic, swarm intelligence, neural networks and intelligent agents.

The future of business intelligence lies in systems that can guide and deliver increasingly smart decisions in a volatile and uncertain environment. Such decisions incorporating sophisticated levels of intelligent problem-solving will increasingly be applied autonomously and within real time constraints to achieve the level of adaptability required to survive, particularly now within an environment of global warming.

In this new adaptive world the algorithm is therefore a two-edged sword. On the one hand it can create the most efficient path to implementing a process. But on the other, if it is inflexible and incapable of adapting, for example by continuing to manufacture large fossil-fuel-burning vehicles, it can contribute to collapse, as the troubles of Ford and GM showed.

Good decision-making therefore depends on adapting to changes in the marketplace, which involves a shift towards predictive performance management: moving beyond simple extrapolation metrics to artificial-intelligence-based software analysis and learning, such as that offered by evolutionary algorithms.

Life depends on adaptive algorithms as well: the distance to a food source encoded in the dance of a bee, the interpretation of speech and other acoustic signals, the discrimination between friend and foe, a bird's navigation by the polarisation angle of sunlight, or a bat's collision avoidance based on split-second acoustic calculations.

These algorithms have taken millions of years to evolve and they keep evolving as the animal adapts in relation to its environment.

But here's the problem for man-made algorithms: very few have been designed with the capacity to evolve without direct human intervention, which may come too late, as in the case of an obsolete vaccine or an inadequately encrypted file.

The rate of change affecting enterprise environments will continue to accelerate, forcing decision-making to speed up in response and to become increasingly autonomous, with minimal human intervention. This has already occurred in advanced control, communication and manufacturing systems, and it is becoming increasingly common at the operational level in e-business procurement, enterprise resource planning, financial management and marketing applications, all of which depend on a large number of algorithms.

Dynamic decision support architectures will be required to support this momentum and be capable of drawing seamlessly on external as well as internal sources of knowledge to facilitate focused decision capability.

Algorithms will need to evolve to underpin such architectures and act as a bulwark in this uncertain world, eventually driving it without human intervention; but only if they are self-verifying within the parameters of their human and computational environment.
