
Software Verification and Software Validation – What’s the Difference?

What is Software Validation?

Validation can be described as verifying that you are building what the customer wants. Software validation is the process of confirming that your application will meet the customer's functional and non-functional requirements, and it starts before coding and continues during development. In general terms, validation is verifying that the customer wants a bike before you put together a car; in software, it is verifying that the customer wants accounting software before you build a data management tool. Software validation takes place primarily during the system requirements analysis and design phase, before you start writing code.

What is Software Verification?

Verification is confirming that the product is being assembled correctly, after you have determined what you are supposed to build. If you are building a bike, verification is checking that you have all of the parts in the correct sizes, from a frame of the right length to a correctly sized seat. For an accounting tool, software verification would involve ensuring that the accounting register and check generation work per customer requirements before considering the coding complete. Software verification takes place during the implementation, integration and testing phases, and it should be completed before the software is released.

Here’s The Difference

Software validation is determining, for example, whether an interface with a website or another software application is necessary. Software verification is ensuring that the interface works appropriately after it is “built”. When software requirements creep beyond their original scope, a new requirement such as an additional software interface gets added. After the new interface is built, it must be verified to ensure that information flows between the two applications without errors.
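As a rough illustration of what that interface verification can look like in practice, here is a minimal sketch of an automated round-trip check. The function names (export_invoice, import_invoice) and the JSON record format are hypothetical placeholders for illustration, not part of any particular product.

```python
import json

def export_invoice(invoice: dict) -> str:
    """Hypothetical exporter: serialize an invoice for the receiving application."""
    return json.dumps(invoice, sort_keys=True)

def import_invoice(payload: str) -> dict:
    """Hypothetical importer: parse the payload on the receiving side."""
    return json.loads(payload)

def verify_interface_roundtrip(invoice: dict) -> None:
    """Verification step: confirm the record survives the transfer unchanged."""
    received = import_invoice(export_invoice(invoice))
    assert received == invoice, f"Interface corrupted data: {received!r} != {invoice!r}"

# One of the checks a verifier might run once the interface is "built".
verify_interface_roundtrip({"invoice_no": 1042, "amount": 125.50, "currency": "EUR"})
print("Round-trip check passed")
```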

Common Causes of Confusion

Scope creep, the addition of new requirements after the scope has been decided, essentially pushes software validation work into the software verification phase. Adding new requirements mid-project adds cost and time to the software development process.

How to Avoid the Confusion
1. Include all stakeholders in the requirements definition phase.
2. Set a firm end to the requirements definition phase. Refuse to let new requirements creep in once coding begins, even if someone promises that it is a minor change.
3. Determine where the data must come from and go to as part of typical use of the tool. If a data interface will be used frequently, make the ability to transfer data between those applications or databases a requirement, and include testing of these data transfers in software verification.
4. Focus software validation efforts on core functions over optional enhancements if crunched for time. Make sure the critical functions work first; new features can be dropped if there is not enough time to test them.
5. Recognize that working core functionality is a requirement, not an enhancement. Prioritize testing of functional requirements over cosmetic ones; a report with hard-to-read headers is an annoyance, but working data imports and accurate reports are essential.


Eternal Y2K Lessons

When NASA reported that we weren’t all going to die in September 2015, I had flashbacks to the 2012 hysteria and the Y2K fears that combined technological disaster with the Rapture. Obviously, the world didn’t end, but the Y2K experience provided some interesting lessons that I still use today.

These events occurred early in my career, nearly straight out of college. Names (except my own) and places have been changed to protect the innocent and to avoid the charge of making fun of a former employer.

I was learning to enter production orders into our MRP system. When I entered a resin processor equipment order with a delivery date six months out, it processed and came up as 99 1/2 years late. I chalked it up to my own mistake. Then resin orders placed almost a year in advance, for fall 2000, showed up as 99 years late. I asked one of the purchasing staff what I was doing wrong. “Mary” said, “Well, I’m ordering equipment and resin stuff for next year, and I’m getting the same mistake.” Ah, so it wasn’t just me. We took it to the manager, who took it to IT.
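In hindsight, the symptom is easy to reproduce. The sketch below is not our MRP system’s actual logic, only an assumption about the kind of two-digit-year arithmetic that makes an early-2000 delivery date look roughly 99 1/2 years overdue when “today” is mid-1999.

```python
from datetime import date

def lateness_with_two_digit_years(today: date, due: date) -> float:
    """Hypothetical buggy calculation: both dates are reduced to two-digit years,
    so '00' is interpreted as 1900 instead of 2000."""
    due_2d = date(1900 + due.year % 100, due.month, due.day)
    today_2d = date(1900 + today.year % 100, today.month, today.day)
    return (today_2d - due_2d).days / 365.25  # years the order appears to be late

# An order entered in mid-1999 for delivery at the start of 2000...
print(lateness_with_two_digit_years(date(1999, 7, 1), date(2000, 1, 1)))
# ...comes out as roughly 99.5 years late instead of six months in the future.
```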

Welcome to a true Y2K nightmare.
Our MRP software wasn’t Y2K compliant. Worse yet, it was no longer supported by IBM. The options were:

1. Buy a big-name software package.
2. Buy a little-name software package.
3. Make our own software.
4. Deal with it.

Since it was summer 1999, option 1 wasn’t really an option. All the big-name software vendors were too busy to talk to a small manufacturer, while freelance programmers made a fortune fixing a calendar bug so common that some people thought it would shut down the computing world. Small software vendors were willing to talk to us, but only if we paid a hefty up-front fee to get them to show up. Given the firm’s limited purse strings, paying a software vendor to grace us with their presence, without any guarantee of solving the problem in time, was not an option either.

Thus option 3, making our own MRP software in house, was selected. A computer science graduate straight out of college was hired to develop a new MRP application, and, amazingly, one was developed and installed by October.

The training session was given by a guy who had been trained by the developer, who had then left to make a fortune in last-minute Y2K contracting. I asked so many questions that the 30-minute training session lasted 1 1/2 hours. I took detailed notes, since there was not a single handout on how to use the software. Many of my questions couldn’t be answered, except with the answer that only the programmer would know, and he wasn’t there. Then came the recommendation from the programmers to try the software after it was installed and see if what I thought would work did, indeed, actually work. A frightening thought, when you think about it: beta testing should be done before production release.

When the training was over, I typed up my notes in a Microsoft Word document. Since there was no handout for the users, I wanted to create my own for reference. I e-mailed the notes to the trainer with the single line: “Is what I wrote correct?”

Two hours later came a mass e-mail to the whole company, with my notes attached. The only line in the message was: “Here’s the user guide for the new MRP software”. Thus began my technical writing and IT documentation experience.

Thus the new homegrown application was “released”. There were minor bugs, but it was Y2K compliant. Orders placed 18 months in advance showed as over a year out, as they should. It seemed the Y2K bug was fixed on our system.

January 2nd was filled with trepidation. No one had actually been in the office on January 1, 2000, to see if the date rolled over correctly; if civilization was going to collapse, there was no point babysitting computers.

The first workday arrived. All our computers turned on. All the equipment turned on. The additive feeders fed additives at the right rate. The level measurers read correctly. None of the mechanical devices ripped out of the wall to chase employees down the corridors, a la The Simpsons Halloween special. Disaster averted.

Then I went down to the process control center. There was a long string of blinking lights. The date display on all the equipment controllers was flashing 00/00/00. The one software system no one had bothered to check. Individual units were too stupid to care what the date was; the central controller unit, though, had no idea what to do.

There were three eternal Y2K lessons learned from the experience.

1. Figure out all the software that needs to be fixed before you begin trying to fix it. This is otherwise known as requirements definition.
2. When you’re writing – or rewriting – software, document how to use it before you train people how to use it or expect them to use it.
3. Make sure the trainers are thoroughly trained in the software before they train users. If the trainer cannot answer questions, the questions will filter up to development, and that is not a value-added proposition.

Software Features – Quality, Quantity and the Difference – Updated

Software quality is not necessarily measured by the sheer number of features.

The only relationship between the two is found in a few metrics that measure quality by the ratio of defects to the number of features or modules of code. By that measure, adding a few features that work (or reclassifying a few existing operations as features) improves the quality of the software, because the number or percentage of defects per feature declines.
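To make that concrete, here is a toy calculation of the kind of defect-density metric meant above; the numbers are invented purely for illustration.

```python
def defect_density(defect_count: int, feature_count: int) -> float:
    """Defects per feature: one common way to turn quality into a number."""
    return defect_count / feature_count

# 12 known defects across 40 features...
print(defect_density(12, 40))   # 0.3 defects per feature

# ...reclassify or bolt on 10 more working "features" and the metric improves,
# even though nothing the user cares about got better.
print(defect_density(12, 50))   # 0.24 defects per feature
```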

Increasing the number of features does not necessarily improve the quality of a product. For example, adding more compatible data formats only improves the quality of the software if many users couldn’t get clean data imports until these data formats were made compatible. After this point, building in file conversions is a convenience but not a necessity.

Computer aided design software must allow drafters to create 2D and 3D designs. They need to be able to create lines, shapes, surfaces and models. The ability to add surface effects, create animations and introduce lighting effects consists of additional features, but those do not improve the CAD software’s quality. The CAD software’s quality is measured in actual use: whether intersections are clean and error-free when 3D models are assembled, whether parametric modeling sizes items properly, and whether data is never corrupted by imports, exports and multiple saves by a product development team.

What features improve software quality?

• Anything that makes the user’s life easier, from simplifying process steps to eliminating tedious data entry
• Faster performance that saves users or system administrator time
• Functionality that simplifies or automates data transfers and inputs, such as automating the imports of customer data into your MRP system or updating your accounting software with transactions downloaded from a bank website
• Functions that allow users to correct their mistakes without having to go back to a saved file and recreate the intervening steps; automatic backups and the “undo” button are good examples of this
• Time-saving features based on the most commonly performed tasks, as long as the software’s defaults can be overridden; for example, flagging possible duplicate records makes record cleanup simpler, but software that automatically combines likely duplicates creates a mess when two people with the same name find their medical records mixed up in one file (see the sketch after this list)
• Mistake-proofing features that prevent errors that take time and effort to correct
• Built-in help that saves time and possibly the expense of contacting application support
• Features that ensure data accuracy and record completeness improve data quality and the software’s value to users
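
As a small sketch of the duplicate-record point above: flag likely duplicates for a human to review rather than merging them automatically. The record format and the matching rule here are assumptions chosen for illustration.

```python
from itertools import combinations

def flag_possible_duplicates(records: list[dict]) -> list[tuple[dict, dict]]:
    """Return pairs that look like duplicates; a person decides whether to merge."""
    flagged = []
    for a, b in combinations(records, 2):
        if a["name"].lower() == b["name"].lower() and a["dob"] == b["dob"]:
            flagged.append((a, b))
    return flagged

patients = [
    {"id": 1, "name": "Jane Doe", "dob": "1980-02-14"},
    {"id": 2, "name": "jane doe", "dob": "1980-02-14"},   # probably the same person
    {"id": 3, "name": "Jane Doe", "dob": "1975-09-30"},   # same name, different person
]

for a, b in flag_possible_duplicates(patients):
    print(f"Review records {a['id']} and {b['id']}: possible duplicate")
```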

When developing the requirements or code for software or any other IT product, remember that quality refers to how well the product helps the user perform its core function quickly, easily, cheaply and correctly. Everything else is simply a bonus, and a bell or whistle that interferes with the product’s primary purpose is worse than irrelevant.


ESTIEM Vision Hamburg – “Maritime Food Logistics”

Moin Moin, dear readers!

We’ll let you know a little bit more about our awesome experience at Vision “Maritime Food Logistics” in Hamburg. After arriving on Portuguese time (a little late in the evening), we had the chance to meet everyone while playing ‘get to know each other’ games.

Throughout the week, we had an ESTIEM presentation, where all the committees, initiatives and projects were introduced to us. We also had our first contact with harbours and their logistics; as an example, we learnt how bananas enter Europe!

We visited the China Shipping Company, one of the world’s largest integrated international container transportation and logistics companies; the Hamburg Harbour; and the Maritime Museum, where we pretended to be Jack Sparrow, Captain Jack Sparrow, as we controlled and CRASHED our vessel (don’t worry, it was just a simulation). Additionally, we visited Still, which provides customized solutions for intralogistics. There we toured the shop floor, observed their systems and production line up close and, at the end, solved a case study about the Hamburg Harbour.

Oops, we almost forgot about all the parties and leisure!! During that week, we fell in love with both the days and the nights of Hamburg. We visited the city and its monuments, a brewery, the Red Light District (Reeperbahn), the Uni, where we had some amazing parties, and the Fish Market, while it was raining cats and dogs!

It was a very amusing week, with some great people and awesome work by the organisers! If you have the chance, go to a Vision; you’ll have an unforgettable experience!

In High Estiem,
Maria Afonso & Filipe Rocha, LG Porto