Quote

"Between stimulus and response there is a space. In that space is our power to choose our response.
In our response lies our growth and freedom"


“The only way to discover the limits of the possible is to go beyond them into the impossible.”


Wednesday, 28 November 2012

Creating a Data Dump and Using it Across Oracle DB Servers

We came across a use case where we needed to hand over a data dump to a new environment that was being created for the public domain. The database servers in use were Oracle 11g. The overall process became simple thanks to the Data Pump utility. Oracle provides Data Pump export and import utilities (expdp/impdp) whose jobs can be stopped and restarted; the ability to detach from and reattach to large jobs lets DBAs multitask. The utility can also estimate how much space an export job would consume without actually performing the export.
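For instance (a sketch, not part of the original steps), the ESTIMATE_ONLY parameter reports the estimated size of an export without writing a dump file; the credentials and schema below are the same placeholders used in the export step further down:

          $./expdp dbuser/password@SID schemas=myschema estimate_only=YES nologfile=YES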

Using the Export utility of Data Pump

To use Data Pump, a directory needs to be created on the server machine, and a directory object needs to be created in the database and mapped to that directory in the server's file system. The steps are as follows:

1) Create a directory in the server file system

          $mkdir saty_data_dir

2) Create a directory object in the database and map it to the file system directory
  • Log in to SQL*Plus as sysdba: ./sqlplus sys as sysdba
  • SQL>create directory data_pump_dir as '/usr/satyam/saty_data_dir';
  • SQL>grant read, write on directory data_pump_dir to dbuser;

3) Export the data dump to a file


          $./expdp dbuser/password@SID schemas=myschema directory=data_pump_dir dumpfile=myfilename.dmp logfile=myfilename.log

myfilename.dmp will be created in /usr/satyam/saty_data_dir.

Using the Import utility of Data Pump

The following steps need to be performed on the second server:

1) Create a directory in the server file system

          $mkdir saty_data_dir

2) Create a directory object in the database and map it to the file system directory

  • Log in to SQL*Plus as sysdba: ./sqlplus sys as sysdba
  • SQL>create directory data_pump_dir as '/usr/satyam/saty_data_dir';
  • SQL>grant read, write on directory data_pump_dir to dbuser;

 3) Copy myfilename.dmp from the first server into /usr/satyam/saty_data_dir on the second server

 4) Import the data dump from the file

          $./impdp dbuser/password schemas=myschema directory=data_pump_dir dumpfile=myfilename.dmp logfile=myfilename.log

Now you can verify that the tables and the other data have been moved to the new server.
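As a quick sanity check (again a sketch, not part of the original steps), the imported objects can be counted from the data dictionary on the second server, using the placeholder schema name from the example above:

          SQL>select object_type, count(*) from dba_objects where owner = 'MYSCHEMA' group by object_type;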

Monday, 5 November 2012

Mobile Application Development and Quality Challenges

Emergence and the Rise of the Mobile Applications

New technology has repeatedly enabled the delivery of traditional services in new dimensions and new ways. In the banking sector, for example, ATMs came into existence in the 1970s, followed by Internet banking in the 1990s, and post-2010 banking has found a new platform: the mobile device, a.k.a. the smartphone.

Mobile applications have transformed from fun, cool possessions of the young and enthusiastic into a business-critical need for working professionals. They are still often seen as a value-added service, but they are likely to become the de facto delivery medium for almost all services and products in the near future.

While reading this you might already have made a fund transfer from a mobile device, or you might be planning to acquire the capacity to do business on the move without having to wait for your laptop to boot, or while on a vacation where you do not want to carry that epitome of work, the laptop, with you. User empowerment has increased exponentially with anytime, anywhere capability being handed to the end user on the go.

As per industry estimates, the enterprise mobility business alone is expected to be close to 180 billion by 2017. This spending aims to optimize business cycles by pushing them to real-time, 24x7 operation without restrictions of location or dependencies on specific networks.

Managing the Enterprise Mobility Space


Enterprise mobility has carved out a clear space for itself in the IT business world, and new concepts and principles are being laid down to embed it in business. These concepts and principles are needed to ensure that mobility growth is efficient, effective, and in line with the existing values and propositions of the business. Some of the terms that have emerged with the advent of enterprise mobility are as follows:

MDM (Mobile Device Management)

A software suite used by companies to control, encrypt, and enforce policies on mobile devices such as tablets and smart phones of their users.

MAM (Mobile Application Management):   

A software suite used by companies to control, encrypt, and enforce policies on the applications installed on the mobile devices such as tablets and smart phones of their users.

MIM (Mobile Information Management)

A software suite used by companies to control, encrypt, and enforce policies on the data to be synced to the mobile devices such as tablets and smart phones of their users.

BYOD (Bring Your Own Device)

Is a concept where the end user can bring their own device and hook it to the corporate resources, such as network, machines and applications. MDM, MAM, and MIM are the suggested enablers for implementing BYOD; however it is not necessary to use them together.

BYOA (Bring Your Own Application): 

Is a concept that promotes the use of third-party applications and cloud services in the workplace. Security, compliance, and regulatory concerns are the primary risk areas; they can be mitigated using an appropriate combination of MDM, MAM, and MIM.

The above guiding principles help the enterprise mobility space to be assimilated within the existing constraints of business functions. They bring both opportunities and risks, and depending on the need, vision, and type of business, they are being adopted partially or fully in the enterprise world.

Enterprise Mobile Application Development and Testing Challenges


Ever Evolving Mobile Technologies


Even though it has been some time since mobile technologies arrived, they are far from standardized. Every other day we hear a new announcement or a new release on one platform or another. The competitive battle to bring out the best, before anybody else, has even reached the courtrooms in an effort to establish market dominance. The rush to push the latest technology to the end user has resulted in unprecedented empowerment and benefits for the end user, and in the process businesses have benefited as well.

Shorter Go to Market Time with Latest Features


The rush of evolving technologies has always kept the development and testing groups on their toes. The dual pressure of optimizing a newly established delivery process and including the latest features and functionality keeps teams busy with increased development/testing and re-development/re-testing effort. The pressure to reduce time to market is greater than in any other stream of technology.
For non-enterprise mobile applications, where the majority of functions do not depend on back-office operations, delivery cycles are as short as a couple of months. The bigger challenge is for enterprise applications, which have to put core enterprise functionality developed over years onto the user's plate on a mobile device, in a relatively short cycle and with relatively small teams.

Increasing Mobile Platform Base


In the recent past, mobile technology was only for the top brass of management, and there were limited platform players in the market; organizations primarily had to choose between two players, the Apples and the BlackBerrys of the world. Now Android and Windows Mobile have already eaten into a considerable share of the prior leaders' market, even though they appear to have only just started expanding their base. With the user base widening to lower management and the consumer application market opening up, the platform and device base has grown exponentially.
From supporting a few devices on one platform, enterprise applications now have to support all the premier platforms and many primary devices on each. The premier fighters in the platform market are Apple, BlackBerry, Android, and Windows Mobile. Each platform player is acquiring OEMs (Original Equipment Manufacturers), as with Google buying Motorola, or tying up with them, as with Samsung flirting with Windows Mobile, to push more and more devices into the market. Both the consumer and the enterprise application markets are under constant pressure to support all or most of the newly released devices, and product and delivery teams are always in a quandary about which platforms and how many devices to support.

Delivery teams are looking towards alternatives such as MEAPs (Mobile Enterprise Application Platforms) to counter the challenge of catering to an ever-growing platform base while keeping cost under control and reducing time to market. MEAPs offset the huge development/maintenance/upgrade effort required in native platform environments by using the "write once, deploy everywhere" paradigm. They abstract away the native platform implementation complexities, enabling the development team to focus on business functionality.

However, the picture in the MEAP world is not as rosy as it looks. It has its own constraints and dependencies due to which the team may not be able to utilize the full potential offered by the native platforms. It’s a tradeoff between effort required to port to new platforms and having full control over platform features.

Integrating with Mainstream Enterprise


Making enterprise applications available on mobile devices requires a huge integration effort. Enterprise applications such as CRM, ERP, and billing must be integrated with the mobile application so that end users can access them from their devices. The integration effort needs to ensure that access to related concerns, such as the database, authentication, and data integrity, goes through secure and well-defined interfaces.
Without proper and thorough testing, deployments are likely to fail to meet user requirements and may cause project failure, even if the application works perfectly in isolation. Creating test simulators for a mobile application is a huge challenge if the interface to the back-office/enterprise application is not well defined.

Addressing Security Concerns


Security is the biggest deterrent to widespread adoption of enterprise mobile applications. Exposing business-critical and confidential data through enterprise mobile applications is a huge risk. Several new techniques, such as MDM (mobile device management), MAM (mobile application management), and MIM (mobile information management), are being put in place to exert greater control over the mobile ecosystem for enhanced security. Several other security threats loom over enterprise mobile applications, such as hacking via platform/network loopholes, loss of mobile devices, and mobile malware; as per a recent study, mobile malware is doubling every year. Examples include applications that collect personal information, such as passwords and browsing history, and applications that send SMS messages or make calls to premium-rate numbers.

Mobile security threats can be countered by following a few best practices, such as:

o       Confirming PCI DSS and SOX compliance
o       Avoiding client-side data storage to the maximum possible extent
o       Enabling data protection mechanisms for data in transit
o       Using strong authentication mechanisms for accessing resources
o       Implementing a least-privilege authorization policy
o       Protecting against client-side injection and denial-of-service attacks
o       Protecting against third-party malicious code using MAM
o       Implementing strong server-side controls


Lack of End-to-End Testing/Verification Frameworks:  


Over the years, competing technologies and tools have been developed and stabilized to meet the testing and verification needs of various enterprise applications. However, because the mobile ecosystem has emerged quickly from diverse platforms and equipment manufacturers, there is no standard tool or framework that caters to the diverse testing needs of enterprise mobile applications.
End-to-end verification of enterprise mobile applications requires an in-depth understanding of enterprise operations and expertise in mobile technologies. It is not possible to test and verify the entire spectrum of an enterprise mobile application with a single tool. A well-designed automation strategy is imperative to support the manual effort of certifying the application on a broad device and platform base, and a combination of the testing tools/frameworks available for enterprise and mobile application testing needs to be used.

Testing/Verification Approaches

There are several tools available to bridge the testing requirements of the enterprise mobile spectrum. Testing can be done locally on the desktop using emulators or physical devices, or it can be done remotely using a cloud service. Remote cloud services, such as DeviceAnywhere and Perfecto Mobile, provide access to device labs and to the local network on which the application is to be deployed, which makes them very helpful in scenarios where network testing is required. However, local testing should be the preferred method for full functional regression of the mobile application. For local testing, the tools can be broadly categorized by their approach to verifying mobile applications:

o       Native object recognition based: Object recognition tools identify the elements of the application as objects (text boxes, labels, buttons, etc.) along with their properties (enabled, disabled, hidden, etc.).
o       Image/co-ordinate recognition based: Image based tools identify application components as images and try to locate them using relative or absolute co-ordinates.

Advantages of Image based automation tools

The majority of the tools available in the market are image based because of advantages such as:
o       Image based tools are independent of the application's underlying technology
o       Image based tools are compatible with all the mobile platforms
o       They are easy to learn and implement
o       Scripts can be re-used across platforms if the UI components remain unchanged

Disadvantages of Image based automation tools

Even though the majority of tools are image based, a purely UI-driven automation approach may not be a good idea when deep, invasive, and maintainable functional tests are required:

o       Changes in UI layout severely impact test scripts, quickly making them obsolete
o       Test results become unreliable even with minor changes in the UI
o       Verification of application components is limited
o       Long-term maintenance cost is high

List of testing/automation tools


Following is a list of major testing tools for mobile application testing:

o       DeviceAnywhere (all major platforms)
o       Perfecto Mobile (all major platforms)
o       Eggplant (iOS, Android, BlackBerry)
o       SilkMobile (iOS, Android, BlackBerry)
o       JamoSolutions (iOS, Android, BlackBerry, Windows Mobile)
o       TestQuest (Android, BlackBerry, Symbian, Windows Mobile)
o       UIAutomation framework (iOS)
o       Robotium (Android)
o       FoneMonkey (Android, iOS)
o       MonkeyRunner (Android)
o       Monkey (Android)
o       SeeTest Studio (iOS, Android, BlackBerry, Windows Mobile, Symbian)

The list is ever growing and there might be special tools for special cases. Please add to the list if you wish to suggest.



Friday, 7 September 2012

Automating Android Browser Testing Using Android WebDriver


If your web application supports the Android platform, you need to perform functional verification of the application on Android. Just as for a desktop application, you can use WebDriver's Android avatar to run automated end-to-end tests that ensure your site works correctly when viewed from the Android browser. The advantage of Android WebDriver is that it supports all the core WebDriver APIs, plus mobile-specific features such as finger scrolls and long presses, as well as HTML5 features such as local storage, session storage, and the application cache. Android WebDriver uses native events to interact with the web page, which is rendered using WebView, the component the Android browser uses to render web pages.

Now that we know the basics of Android WebDriver, let us start the installation process. The first question is: what are the prerequisites for installing Android WebDriver? The setup and exercises we will be doing require the following:

1) Eclipse with ADT
2) Android SDK installed
3) Android WebDriver
4) Android device emulators or real android devices

Once you have the above requirements ready, follow these steps to set up end-to-end web application testing on Android:

Step 1: Getting the device or the emulator ready


If you have a physical Android device, you can connect it to your machine over USB. If you do not, you can use the Android emulator. The steps to set up the emulator are as follows:

-- Create an Android Virtual Device using the command line: You can use 'AVD Manager.exe' in the Android SDK to launch a GUI that helps in creating an Android Virtual Device, but the same task can be performed from the command line as well. Navigate to the directory ~/android-sdk/tools and run the command:
android create avd -n YourAndroid -t 12 -c 100M


Select 'no' when you are asked whether to create a custom hardware profile, as it is not required.

After creating the AVD named YourAndroid, you can start the emulator using the following command:
emulator -avd YourAndroid &
Once your emulator or device is ready, the environment setup is done.

Step 2: Running the tests Using Remote WebDriver Server


The Android Server is an Android application (.apk) that contains an HTTP server. The WebDriver commands from the JUnit tests make requests to the Android Server, which forwards them to the Android WebDriver; the WebDriver processes each request and sends an appropriate response back.

You need to get the serial ID of the device or the emulator to be able to install the Android Server to it. You can run the following command to get the list of available devices:

adb devices

We need to navigate to the ~/android-sdk/platform-tools directory on the command line to execute the above command. The output of the command lists all the available devices, such as:
List of devices attached
emulator-5554   device


Here emulator-5554 is the serial number that needs to be passed to the install command. To install the Android Server, first download it from the download link. After downloading the Android Server .apk, run the following command to install it on the target emulator/device:

adb -s <serialId> -e install -r  android-server.apk 

In our case the serial ID is emulator-5554, and the server version available on the download page at the time of writing is android-server-2.21.0.apk. Hence we run the following command to install the Android Server on our virtual device:

adb -s emulator-5554 -e install -r android-server-2.21.0.apk

If the installation succeeds, adb reports 'Success'.

To install applications that do not come from the Android Market/Play Store, you need to configure your device or emulator: make sure Settings --> Applications --> Unknown sources is checked. Now that the application is installed on the device/emulator, it needs to be started. This can be done from the device GUI or from the command line. The command to start the server application on the device is as follows:
adb -s <serialId> shell am start -a android.intent.action.MAIN -n org.openqa.selenium.android.app/.MainActivity -e debug true

so in our case the command will be:
adb -s emulator-5554 shell am start -a android.intent.action.MAIN -n org.openqa.selenium.android.app/.MainActivity -e debug true

Alternatively, the Android Server application can be started by tapping the WebDriver icon on the device GUI.

After starting the Android Server, we need to set up port forwarding so that requests are sent from the host machine to the emulator. Port forwarding makes the Android Server on the emulator available to the host machine over an HTTP connection at the URL http://localhost:8080/wd/hub.

Run the following command for port forwarding:
adb -s emulator-5554 forward tcp:8080 tcp:8080
Now the emulator will be available at http://localhost:8080/wd/hub.
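As an aside (a sketch, not part of the original steps), once the port is forwarded the Android Server can also be driven through the standard RemoteWebDriver client, pointing it at the hub URL above; this assumes the Selenium 2.x client jars are on the classpath:

import java.net.URL;

import org.openqa.selenium.WebDriver;
import org.openqa.selenium.remote.DesiredCapabilities;
import org.openqa.selenium.remote.RemoteWebDriver;

public class RemoteAndroidSketch {

    public static void main(String[] args) throws Exception {
        // The forwarded port exposes the Android Server at this hub URL on the host machine.
        WebDriver driver = new RemoteWebDriver(
                new URL("http://localhost:8080/wd/hub"),
                DesiredCapabilities.android());
        driver.get("http://satyamsing.blogspot.com");
        System.out.println("Page title: " + driver.getTitle());
        driver.quit();
    }
}

The JUnit test in Step 3 below uses AndroidDriver directly, which by default talks to the same local hub URL.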

Step 3: Writing and Executing Tests

Now we can write JUnit tests in Eclipse and run them on the Android device. A simple JUnit test is as follows:


import junit.framework.TestCase;
import org.openqa.selenium.By;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.WebElement;
import org.openqa.selenium.android.AndroidDriver;

public class TestAndroidWebDriver extends TestCase {

      public void testAndroid() throws Exception {

          // Start a session against the Android Server running on the device/emulator.
          WebDriver driver = new AndroidDriver();
          driver.get("http://satyamsing.blogspot.com");
          String title = driver.getTitle();

          // Navigate by clicking links located via their partial link text.
          WebElement e = driver.findElement(By.partialLinkText("Android"));
          e.click();
          e = driver.findElement(By.partialLinkText("Topic 3"));
          e.click();

          // Verify the title captured from the landing page.
          assertTrue("Got title: " + title, title.contains("Satyam"));
          driver.quit();
      }

}

Add the above test in your project using the following steps:

1) Create a test project and add the Selenium and JUnit jars to it as external libraries
2) Create a class named 'TestAndroidWebDriver' and copy the above code to it
3) Compile and run the JUnit test

You can now see your test running on the Android emulator or device.
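To exercise the mobile-specific gestures mentioned at the start of this post (finger scrolls and long presses), the Selenium 2.x TouchActions helper can be used. The snippet below is only a sketch, assuming the AndroidDriver in use exposes a touch screen; the link locator is hypothetical:

import org.openqa.selenium.By;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.WebElement;
import org.openqa.selenium.android.AndroidDriver;
import org.openqa.selenium.interactions.touch.TouchActions;

public class TouchGestureSketch {

    public static void main(String[] args) {
        WebDriver driver = new AndroidDriver();
        driver.get("http://satyamsing.blogspot.com");

        // Finger scroll: flick the page upwards with a vertical speed.
        new TouchActions(driver).flick(0, -400).perform();

        // Long press on a link found by (hypothetical) partial link text.
        WebElement link = driver.findElement(By.partialLinkText("Android"));
        new TouchActions(driver).longPress(link).perform();

        driver.quit();
    }
}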

Monday, 27 August 2012

Test Driven Development the Pros and the Cons



Test-driven development (TDD) is a method of software development typically followed by teams adopting the Agile model. TDD requires tests to be written first, and the functionality then coded so that those tests pass. This implies that each new line of code should have an already existing, failing test case for it.
A typical TDD workflow proceeds as follows:

First, the tests that need to pass for the given functionality are identified and coded in the programming language or framework of choice.
Then the code is written to make those tests pass.
Once the code is written, the tests are executed. If they pass, new tests are written for the next piece of functionality; otherwise the code is fixed until the existing tests pass. This cycle is repeated over the entire application development life-cycle.

A programmer following a TDD approach will typically refuse to write a new function or line of code until there is an existing test for it.
By default, source code produced with the TDD methodology is more thoroughly tested, and the confidence level is much higher than with a conventional code-first approach.
TDD ensures that the code is unit tested and pushes code coverage close to 100%.
The chances of scope creep by developers are also minimized, as the pre-written tests tend to prevent it.
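As a minimal illustration of one red-green cycle (a sketch, not from the original post), here is a JUnit 3 style test written first for a hypothetical StringCalculator class, followed by just enough code to make it pass:

import junit.framework.TestCase;

// Step 1 (red): the test is written first and fails because add() does not exist yet.
public class StringCalculatorTest extends TestCase {

    public void testAddTwoNumbers() {
        assertEquals(5, new StringCalculator().add("2,3"));
    }
}

// Step 2 (green): just enough production code is written to make the test pass.
class StringCalculator {

    int add(String numbers) {
        int sum = 0;
        for (String n : numbers.split(",")) {
            sum += Integer.parseInt(n.trim());
        }
        return sum;
    }
}

Once the test is green, the code and the tests are refactored, and the cycle starts again for the next piece of functionality.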

Over time, the adoption rate of TDD has been comparatively low for various reasons, including some myths associated with the methodology:
  • Scalability issues: Tests can take a long time to run, especially in large projects.
  • Training needs: Effort is required to train the team to write good tests; developing test-writing skills is an incremental process that takes its own time to mature and cannot happen overnight.
  • Technical challenges: Some situations are genuinely hard to test first; for example, writing tests for multi-threaded code can be very challenging and may not be the ideal approach.
  • Resistance to change: Every new way of working is opposed at first.
  • Constant refactoring: Refactoring is essential and will be required at every milestone; when the application logic changes, the tests need to be updated accordingly, which can be a big overhead for large and complex projects.

TDD is often thought of as the complete solution for software quality, with the perception that the other software testing verticals are automatically taken care of. In fact, TDD may yield perfectly unit-tested code with greater conformance to the specifications, but the other testing verticals are still very much needed: functional regression, performance and load testing, and security testing will always have to be there to support the overall quality of the product being developed.

So TDD has both pros and cons, and you need to weigh them before adopting it.


Friday, 15 June 2012

TaaS in Cloud Computing: What, Why and How

Fundamental Questions about Testing in Cloud

Cloud testing and TaaS are relatively new subjects in the software testing community, even though many technical papers have been published discussing cloud architectures, technologies, models, design, and management. Hence, test engineers and quality assurance managers often encounter issues and challenges in testing modern clouds and cloud-based applications. Typical questions that come up are as follows:
  • What is cloud testing?
  • What are its special test processes, scope, requirements, and features?
  • What different types, environments, and forms of cloud testing do we need to perform?
  • What are the differences between cloud-based software testing and traditional software testing?
  • What are the special requirements and distinct features of cloud-based software testing?
  • What are the special issues, challenges, and needs for testing in the cloud?

What is TaaS and how is it related to Cloud?

Cloud computing is a model enabling on-demand access to a shared pool of configurable computing resources that can be rapidly provisioned and released with minimal effort. It comes in three major dimensions of service offerings: Software-as-a-Service (SaaS), Platform-as-a-Service (PaaS), and Infrastructure-as-a-Service (IaaS). Slowly a fourth dimension is being added in the form of Testing-as-a-Service (TaaS) in the cloud. TaaS is an endeavor to bring the benefits of cloud computing to the world of software testing. By leveraging those benefits, TaaS can help in the following ways:
  • Help reduce the cost of quality in the cloud
  • Reduce the time to create the test environment
  • Reduce full-time resource requirements for testing
  • Reduce test cycle time
  • Support parallel execution and heavy load generation
  • Hence, decrease the time to move to production
To understand the relative position of this evolving fourth dimension, TaaS, we need to understand the other three dimensions of cloud offerings: SaaS, PaaS, and IaaS.
A typical cloud has several distinct properties: elasticity and scalability, multi-tenancy, self-managed function capabilities, service billing and metering functions, and connectivity interfaces and technologies. The highlights of cloud computing can be captured in the following bullets:
  • Clouds can be Private, Hybrid, Public 
  • Cloud services can be as SaaS, PaaS, IaaS 
  • Cloud usage can be Enterprise, Community, Open
The most important characteristics of the cloud offerings are depicted in the figure below:
The 3 Dimensions of the Cloud

Cloud Migration: everyone is thinking about it, if not already started

Cloud computing provides a cost-effective and customizable means by which scalable computing power and diverse services (such as computer hardware and software resources, networks, and computing infrastructure), application services, business processes, and personal intelligence and collaboration are delivered as services to large-scale global users whenever and wherever they are needed.
As a result, most businesses in both the large and SMB (small and medium business) categories have already started, or are planning to start, migrating to the cloud. The primary drivers being looked at are cost effectiveness and the ease of scaling the infrastructure up and down.

Over to Cloud: What changes for the Testing Team?

Traditionally, people have worked on single-server applications, and several techniques and approaches were developed and practiced over time to ensure the quality of such applications. Now that migration to the cloud is becoming general practice, it is important to understand the do's, the don'ts, and the how-to of cloud migration. To perform cloud-based software testing, one needs to learn testing and measurement activities in a cloud-based environment and how to leverage cloud technologies and solutions. It is also important to figure out the right balance of effort, tools, and resources to employ so that each factor affecting the end-user experience is adequately taken care of without driving up the price of quality. The objectives of the quality team responsible for testing in the cloud should now incorporate the following points:
  • To assure the quality of cloud-based applications deployed in a cloud
  • To verify the functional services, business processes, and system performance of the application deployed on the cloud
  • To verify the scalability based on system requirements of the cloud based application
  • To validate software as a service (SaaS) in a cloud environment
  • To validate the software performance and security
  • To check the provided automatic cloud-based functional services, for example auto-provisioned functions
  • To test cloud compatibility and inter-operation capability between SaaS and applications in a cloud infrastructure, for example, checking the APIs of SaaS and their cloud connectivity to others

Types of Cloud Testing

Cloud testing can take one of the following four forms, depending on who is performing it and what is being tested:
  • Testing over cloud: Tests cloud-based service applications over private, public, and hybrid clouds, based on system-level application service requirements and specifications. This is usually performed by cloud-based application system providers.
  • Testing of a cloud: Validates the quality of a cloud from an external view, based on the cloud's specified capabilities and service features. Cloud and SaaS vendors as well as end users are interested in carrying out this type of testing.
  • Testing inside a cloud: Checks the quality of a cloud from an internal view, based on the internal infrastructure of the cloud and its specified capabilities. Only cloud vendors can perform this type of testing, since only they have access to the internal infrastructure and to the connections between internal SaaS offerings and the automatic capabilities, security, management, and monitoring.
  • Testing a SaaS in a cloud: Aims to assure the quality of a SaaS in a cloud with respect to its functional and non-functional requirements.

What is new in Cloud Testing?

The new features and areas that come up in a cloud-based testing environment are primarily of the following four types:
  • Cloud-based testing environment/platform: A scalable environment/platform is a new capability compared to the traditional fixed, dedicated, pre-configured testing environment.
  • SLAs of the services: In cloud computing, clouds, SaaS offerings, and applications usually provide their services to end users and customers under well-defined service-level agreements. Naturally, these agreements become part of the testing and quality assurance requirements, covering system reliability, availability, security, and performance.
  • Price models and service billing: Since utility computing is one of the basic concepts of cloud computing, price models and utility billing become basic parts of testing as a service. In other words, the required computing resources and infrastructure (including tools) and the testing task services are charged based on pre-defined cost models.
  • Large-scale cloud-based data and traffic simulation: Applying and simulating large-scale online user access and traffic data (or messages) over the connectivity interfaces is necessary in cloud testing, particularly for system-level function validation and performance testing.

TaaS Details

TaaS is the provision of static and dynamic on-demand testing services in, on, and over clouds to third parties at any time. One of its primary objectives is to reduce businesses' IT budgets and let them focus on their core business by outsourcing software testing tasks to a third party using the TaaS service model. The TaaS workflow can be divided into several sub-tasks that need to be completed to make the TaaS model work. The sub-tasks in the TaaS workflow are as follows:
  • TaaS process management, which offers test project management and process control.
  • QoS requirements management, which supports book keeping and modeling of software testing and QoS requirements, including quality assurance modeling.
  • Test environment service, which provides on-demand test environment services to establish the required virtual (or physical) cloud-based computing resources and infrastructures, as well as the necessary tools.
  • Test solution service, which offers diverse systematic testing solutions (such as, test modeling and test methods), and test-ware generation and management services.
  • Test simulation service, which establishes on-demand test simulation environments with selected facilities (such as tools) and supports the necessary test data/message generation.
  • On-demand test service, which provides on-demand test execution services based on selected schedules and test wares.
  • Tracking and monitoring service, which allows test engineers to track and monitor diverse program behaviors at different levels in, on, and over clouds for testing purposes.
  • TaaS pricing and billing, which enables TaaS vendors to offer customers selectable testing service contracts based on pre-defined pricing models, along with billing services.

Tuesday, 8 May 2012

10 skills for developers to focus on in 2012

Software development had a few years of relative calm. But now it’s picking up speed, as HTML5 gains a foothold and Windows 8 threatens to significantly change the Windows development landscape. If you want to stay ahead of the curve, you should consider learning at least a few of these 10 software development skills.

1: Mobile development

If you don’t think it is worth your time to learn mobile development, think again. Global shipments of Android phones in 2011 are almost equal to PC sales. Add in the other big-name mobile devices (iPhones, iPads, and even the “dying” RIM devices), and what you see is that mobile devices now dwarf PCs in sales. What does this mean? If you make your living from software that can run only on a PC (which includes Web sites that don’t work or are hard to use on mobile devices), now is the time to learn mobile development.

2: NoSQL

I appreciate a well-designed relational database schema as much as the next person, but they just are not appropriate for every project. We’ve been using them even when they aren’t the best tool because the alternatives haven’t been great. The last few years have seen the introduction of a wide variety of NoSQL database systems. And now that major service vendors (like Amazon and Microsoft) support NoSQL as well, there is no technical limitation on their use. Are they right for every project? No. Are they going to replace traditional databases? In some projects, and for some developers, definitely. This is the year to learn how to use them, as they will only become more prevalent in the year to follow.

3: Unit testing

We’ve seen unit testing go from being, “Oh, that’s neat” to being a best practice in the industry. And with the increasing use of dynamic languages, unit testing is becoming more and more important. A wide variety of tools and frameworks are available for unit testing. If you do not know how to do it, now is the time to learn. This is the year where it goes from “resume enhancement” to “resume requirement.”

4: Python or Ruby

Not every project is a good fit for a dynamic language, but a lot of projects are better done in them. PHP has been a winner in the industry for some time, but Python and Ruby are now being taken seriously as well. Strong arguments can be made for Ruby + Rails (or Ruby + Sinatra) or Python + Django as excellent platforms for Web development, and Python has long been a favorite for “utility” work. Learning Python or Ruby in addition to your existing skillset gives you a useful alternative and a better way to get certain projects done.

5: HTML5

HTML5 is quickly pulling away from the station. The impending release of IE 10 is the last piece of the puzzle to make the full power of HTML5 available to most users (those not stuck with IE 6 or IE 8). Learning HTML5 now positions you to be on the forefront of the next generation of applications. Oh, and most mobile devices already have excellent support for it, so it is a great way to get into mobile development too. And don’t forget: HTML5 is also one route for UI definitions in Windows 8!

6: Windows 8

Windows 8 should be released sometime in 2012, unless the schedule slips badly. While Windows 8 may very well get off to a slow start, being the top dog in an app store is often based on being the first dog in the race. The first mover advantage is huge. It is better to be in the Windows 8 app store at launch time than to take a wait-and-see approach. Even if Windows 8 sales disappoint, it’s better to be the only fish in a small pond than a fish of any size in a big pond, as recent app sales numbers have shown.

7: RESTful Web services

While I personally prefer the convenience and ease of working with SOAP in the confines of Visual Studio, REST is booming. Even Microsoft is starting to embrace it with OData. JSON really was the final straw on this matter, relegating SOAP to be for server-to-server work only. Unless your applications can run in isolation, not knowing REST is going to hold you back, as of 2012.

8: JavaScript

Before the Windows 8 Developer Preview, it was easy for non-Web developers to look at JavaScript as a Web-only language. No more! JavaScript is now a first-class citizen for native desktop and tablet development, thanks to the Metro UI and WinRT API in Windows 8. XAML + C# or VB.NET may be a good way for you to get things done, but if you want to maximize what you can get out of your knowledge, HTML5 and JavaScript are the best bet. They give you Web and Metro/WinRT, and you can also use them for some of the cross-platform mobile systems out there, like Appcelerator’s Titanium product.

9: jQuery

If you are going to do any kind of Web development where you are working directly with HTML, jQuery is becoming a must-know skill. While there are plenty of credible alternatives, jQuery is quickly turning into the de facto tool for rich UIs with HTML.

10: User experience

Other than getting that first mover advantage in new app stores, there is little to differentiate many applications on a feature basis; it’s a crowded field. User experience, on the other hand, is a different story. Creating a great user experience is not easy; it starts before anyone even downloads your application and continues through to the uninstall process. In the age of instant $0.99 and free app downloads, and ad-supported Web apps, the barriers to switching to another application are mighty low. If your user experience is poor, do not expect much business.
Reference: TechRepublic

Monday, 7 May 2012

Reading a Soap Response and Passing Values to Other Requests in JMeter

In an SOA architecture, web services are often designed so that one of them is a consumer of the other. This often leads to scenarios where the output of one service is fed to another to complete an overall business flow.

JMeter can load the provider and the consumer services simultaneously, and the performance of the system can be measured under the expected load.

Let us create an example where we will be passing the output value from the response of the first request to the input SOAP request of the second web service method.


Step 1: Create a test plan

Step 2: Add a Thread Group

Step 3: Add a User Defined Variable

Step 4: Add a WebService Request

Step 5: Add an XPath extractor

Step 6: Add a BeanShell PostProcessor

Step 7: Add Second WebService Request

Step 8: Add View Results Tree


Step 1: Create a test plan

 

In the default test plan that appears when you open the JMeter UI (via JMeter.bat in the bin directory), set the 'Functional Test Mode' option to true. You may also add user defined variables at the test plan level; however, for this example I will not be using any there.


Step 2: Add a Thread Group


Add a Thread Group element by right clicking the test plan node and selecting 'Add-->Threads (Users)-->Thread Group'
Add Thread Group


Set the number of threads in the provided field, and set the ramp-up period to zero if you want all the users to be started without any time lag. You can introduce a lag between user start-ups using the formula: time lag = ramp-up period / number of users (for example, a ramp-up of 10 seconds with 5 threads starts one thread every 2 seconds).
Set the loop count to specify how many times you want each user to hit the web service. If you want to keep loading the web service indefinitely, check the 'Forever' option on the Thread Group element UI.
You can also schedule the load tests using the 'Scheduler' option provided on the Thread Group UI.


Step 3: Add a User Defined Variable



Right click on the Thread Group element and select 'Add --> Config Element --> User Defined Variables' to add the 'User Defined Variables' element. On the User Defined Variables screen, click the 'Add' button, set the 'Name' of the variable, and leave the value blank.

Add User Defined Variable




Step 4: Add a WebService Request


Right click on the 'Thread Group' element and select ' Add --> Sampler --> WebService (SOAP) Request'

Add Web Service Request

Put the WSDL URL in the 'WSDL URL' text box, load the WSDL, and select the target web method from the 'Web Methods' drop-down; then configure the server name/IP, port, and path.
Copy the SOAP request XML into the 'Soap/XML-RPC Data' text box, or provide the path of a file containing the request in the 'Filename' section.


Step 5: Add an XPath extractor


Right click on the 'Thread Group' element and select ' Add -->Post Processors --> XPath Extractor'

Add XPath Extractor

In the 'Reference Name' field, enter the name of the variable that you created in the User Defined Variables element. Then enter the XPath query that points to the value to be picked from the XML response. If the value 'TARGET' needs to be picked from a response XML of the form <ABCD>TARGET</ABCD>, the following XPath query should work:

//*[local-name()='ABCD']/text()
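For example, a hypothetical namespaced response such as the one below is still matched by the query above, because local-name() ignores the namespace prefix; the extracted text ('TARGET') is stored in the variable named in the 'Reference Name' field:

<soapenv:Envelope xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/">
   <soapenv:Body>
      <ns:MyMethodResponse xmlns:ns="http://example.com/service">
         <ns:ABCD>TARGET</ns:ABCD>
      </ns:MyMethodResponse>
   </soapenv:Body>
</soapenv:Envelope>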

Step 6: Add a BeanShell PostProcessor


Right click on the 'Thread Group' element and select 'Add --> Post Processors --> BeanShell PostProcessor'

Add BeanShell PostProcessor

You can print the value of the user defined variable that you created above using the following BeanShell script (vars is JMeter's built-in variable holder):

print("Beanshell processing  SOAP response");
print("ACQIDD" +${ACQIDD} );


Here the variable created is named "ACQIDD"; its value is read in BeanShell with vars.get("ACQIDD") and is also available as ${ACQIDD} inside samplers.
You can see the value of the variable printed on the console window that was used to launch the JMeter UI.

Step 7: Add Second WebService Request


Add another webservice request to your thread group element and pass the value of the user defined variable to the input request using the following format:
<value>${ACQIDD}</value>


Step 8: Add View Results Tree


Add a View Results Tree element to the Thread Group to view the results of the web service requests. After adding it, save and run the test plan; you will see that the output of the first SOAP request is read to extract the required value, which is then passed into the second SOAP request. The result tree looks something like this:
Results Tree