Test Strategies (Unit IV)
- Software testing is a process which involves a set of activities that can be planned in advance and conducted systematically.
- Testing is required in order to show that the software does what it is intended to do and to discover defects in the software, if any, before it is put into use.
- Software testing does the following in the process of developing good quality software:
- It demonstrates to the developer and the customer that the software meets its requirements
- It helps to identify undesirable system behavior such as system crashes, unwanted interactions with other systems, incorrect computations and data corruption that may occur during the execution of software
- Checks whether the software being developed meets the requirements specified by the user
- Establishes confidence in the software in order to ensure that it is good enough for its intended use
- Verification is a kind of test that answers the question: Are we building the product right?
- It verifies that the software being developed meets the stated functional and non-functional requirements specified in the Requirements Specification Document.
- This checking process begins as soon as the requirements are identified and documented and continues throughout all stages of the development process.
- Validation is a more generic term in software testing.
- It helps in answering the question: "Are we developing the right product?"
- It ensures that the software being developed will meet all the requirements (expectations) of the customer.
- It goes beyond checking conformance with the specification to demonstrate that the software will perform what the customer expects it to do.
- Verification and validation activities examine a range of software work products, including:
- System requirements
- Design models
- Source code
- Proposed test results
A strategic approach to software testing is a well-planned, systematic method to ensure that testing is effective, efficient, and aligned with project goals. It focuses on quality assurance throughout the software development life cycle (SDLC).
The key elements of a strategic testing approach include:
1. Define Testing Objectives: Clearly state what the testing aims to achieve, such as detecting defects, ensuring performance, or verifying compliance with requirements.
2. Develop a Test Plan: Outline the scope, resources, schedule, tools, and responsibilities for testing activities.
3. Select the Appropriate Testing Levels: Include different levels of testing such as:
- Unit Testing: Testing individual components.
- Integration Testing: Checking interaction between modules.
- System Testing: Evaluating the complete system’s functionality.
- Acceptance Testing: Ensuring the product meets user needs.
4. Choose Testing Types and Techniques: Apply suitable testing methods (functional, performance, security, usability, etc.) and techniques (black-box, white-box, or gray-box).
5. Define Entry and Exit Criteria: Specify the conditions that must be met before starting and finishing testing phases.
6. Automate Where Possible: Use automation tools to improve efficiency, especially for regression and performance tests.
7. Measure and Monitor: Track metrics like defect density, test coverage, and test execution rate to evaluate progress and quality.
8. Continuous Improvement: Review results, identify weaknesses, and refine testing processes for future projects.
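As a concrete illustration of the unit-testing level listed above, here is a minimal sketch using Python's built-in `unittest` module. The `apply_discount` function is a hypothetical component invented for this example, not something from the text.

```python
import unittest

# Hypothetical unit under test: a simple discount calculator.
def apply_discount(price, percent):
    """Return the price after applying a percentage discount."""
    if not 0 <= percent <= 100:
        raise ValueError("percent must be between 0 and 100")
    return round(price * (1 - percent / 100), 2)

# Unit tests exercise this one component in isolation.
class TestApplyDiscount(unittest.TestCase):
    def test_normal_discount(self):
        self.assertEqual(apply_discount(200.0, 10), 180.0)

    def test_zero_discount(self):
        self.assertEqual(apply_discount(99.99, 0), 99.99)

    def test_invalid_percent_rejected(self):
        with self.assertRaises(ValueError):
            apply_discount(100.0, 150)

# Run the suite programmatically so the interpreter is not exited.
suite = unittest.defaultTestLoader.loadTestsFromTestCase(TestApplyDiscount)
unittest.TextTestRunner(verbosity=0).run(suite)
```

Each test method checks one behavior (normal case, boundary case, error case), which is the typical shape of a unit test at this level.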
In short, a strategic approach ensures testing is proactive, structured, and quality-focused, leading to a reliable and maintainable software product.
Test Strategies for Conventional Software
Test strategies for conventional software (developed using traditional models like the Waterfall or V-Model) focus on structured, phase-wise testing that aligns with the software development life cycle.
These strategies ensure systematic defect detection and quality assurance.
The main test strategies include:
1. Unit Testing:
- Conducted by developers.
- Tests individual components or modules for correctness.
- Uses techniques like white-box testing.
2. Integration Testing:
- Tests the interfaces and interaction between integrated modules.
- Common approaches:
- Top-Down Integration
- Bottom-Up Integration
- Big-Bang Integration
3. System Testing:
- Performed after all modules are integrated.
- Verifies that the complete system meets the specified requirements.
- Includes functional and non-functional testing (performance, reliability, etc.).
4. Acceptance Testing:
- Conducted by end users or clients.
- Ensures the system satisfies business needs and is ready for deployment.
- Types: Alpha testing (in-house) and Beta testing (user environment).
5. Regression Testing:
- Re-execution of previously passed tests after changes are made.
- Ensures that new code has not introduced defects in existing functionality.
6. Verification and Validation Testing:
- Verification: “Are we building the product right?” Focuses on design and development correctness.
- Validation: “Are we building the right product?” Ensures the final software meets user needs.
7. Performance and Stress Testing:
- Evaluates the system’s behavior under load, stress, or high-traffic conditions.
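The integration approaches above differ in how not-yet-integrated modules are handled; top-down integration, for instance, substitutes stubs for lower-level modules. A minimal Python sketch, where the `OrderProcessor` and `PaymentStub` names are hypothetical examples:

```python
# Top-down integration sketch: the high-level order module is tested
# first, with the real payment module replaced by a stub.

class PaymentStub:
    """Stands in for the real (not yet integrated) payment module."""
    def charge(self, amount):
        return {"status": "ok", "amount": amount}  # canned response

class OrderProcessor:
    """High-level module under test."""
    def __init__(self, payment):
        self.payment = payment  # dependency injected: stub now, real later

    def place_order(self, amount):
        result = self.payment.charge(amount)
        return "confirmed" if result["status"] == "ok" else "failed"

# Integration test against the stubbed dependency
processor = OrderProcessor(PaymentStub())
assert processor.place_order(49.99) == "confirmed"
```

Once the real payment module is ready, the stub is swapped out and the same test is re-run, which is also where regression testing fits in.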
In summary, a conventional software test strategy follows a structured, sequential approach, ensuring each phase of development is thoroughly tested before moving to the next — maintaining stability, reliability, and quality in the final product.
Black-Box and White-Box testing
Black-Box Testing and White-Box Testing are two fundamental approaches to software testing that differ mainly in their focus and in the tester’s knowledge of the system’s internal structure.
Black-Box Testing:
Definition:
Black-box testing is a testing technique where the tester does not know the internal workings of the software. The focus is on checking whether the software behaves as expected based on input and output.
Key Points:
- Tests the functionality of the software.
- Testers do not look at the code or internal logic.
- Inputs are given and outputs are verified against expected results.
Techniques Used:
- Equivalence Partitioning
- Boundary Value Analysis
- Decision Table Testing
- State Transition Testing
Advantages:
- Tester’s view is similar to the end-user’s perspective.
- Useful for validation testing (does the system meet requirements?).
- No programming knowledge required.
Disadvantages:
- Cannot find hidden errors in the code structure.
- Limited coverage since internal paths are not tested.
Example:
If you enter a username and password in a login page, the tester checks if correct inputs allow access and wrong ones don’t — without looking at how authentication is coded.
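The black-box techniques listed above choose inputs without any knowledge of the implementation. Assuming a hypothetical business rule that a valid applicant age lies between 18 and 60 inclusive, equivalence partitioning and boundary value analysis pick test inputs like this:

```python
def is_valid_age(age):
    """Hypothetical business rule: applicants must be 18 to 60 inclusive."""
    return 18 <= age <= 60

# Equivalence partitioning: one representative input per partition
assert not is_valid_age(10)   # invalid partition: below the range
assert is_valid_age(35)       # valid partition: inside the range
assert not is_valid_age(75)   # invalid partition: above the range

# Boundary value analysis: inputs at and just around each boundary
for age, expected in [(17, False), (18, True), (19, True),
                      (59, True), (60, True), (61, False)]:
    assert is_valid_age(age) == expected
```

Note that the tests only compare inputs to expected outputs; they would stay valid even if the function's internals changed completely.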
White-Box Testing:
Definition:
White-box testing (also known as Structural or Glass-box testing) examines the internal structure and logic of the code. The tester knows how the system works and designs tests based on the code.
Key Points:
- Tests the internal logic, loops, and conditions in the code.
- Requires programming knowledge.
- Focuses on code coverage (statement, branch, path coverage).
Techniques Used:
- Statement Coverage
- Branch Coverage
- Path Coverage
- Condition Coverage
Advantages:
- Helps find hidden errors in logic or code structure.
- Enables optimization of code paths.
- Ensures complete coverage of possible execution paths.
Disadvantages:
- Time-consuming and requires technical expertise.
- Not practical for large systems with complex code.
Example:
A tester examines the source code of a login function to ensure all branches (correct login, wrong password, blank input) are properly handled.
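A minimal Python sketch of branch coverage for such a login function follows; the function body and its stored password are hypothetical illustrations:

```python
def login(username, password, stored_password="s3cret"):
    """Hypothetical login check with three outcome branches."""
    if not username or not password:
        return "blank input"          # branch 1: missing input
    if password != stored_password:
        return "wrong password"       # branch 2: bad credentials
    return "login successful"         # branch 3: success path

# White-box tests chosen so every branch executes at least once
assert login("", "x") == "blank input"
assert login("alice", "guess") == "wrong password"
assert login("alice", "s3cret") == "login successful"
```

Unlike the black-box example, these test cases were derived by reading the code and enumerating its branches, which is exactly what branch coverage demands.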
Comparison Summary:
| Aspect | Black-Box Testing | White-Box Testing |
| --- | --- | --- |
| Focus | Functionality | Code structure |
| Tester’s Knowledge | No knowledge of code | Full knowledge of code |
| Type | Functional testing | Structural testing |
| Techniques | Equivalence partitioning, boundary value analysis | Statement, branch, path coverage |
| Used In | Validation testing | Verification testing |
| Example | Testing login form with inputs | Testing logic inside login code |
Validation Testing
Definition:
Validation Testing is the process of checking whether the developed software meets the user’s needs and requirements.
It answers the question:
“Are we building the right product?”
Purpose:
The goal of validation testing is to ensure that the final software system satisfies the intended use and performs as expected in the real-world environment.
Key Features:
- Focuses on external behavior of the system.
- Ensures the software conforms to business requirements.
- Conducted after verification activities.
- Usually performed at the end of the development cycle or during acceptance testing.
Techniques Used in Validation Testing:
- Functional Testing – Checks if the system’s functions work as specified.
- System Testing – Tests the complete integrated system as a whole.
- Acceptance Testing – Conducted by the client or end-users to confirm the software meets their expectations.
- User Interface (UI) Testing – Ensures that the software is user-friendly and behaves as expected.
Steps in Validation Testing:
- Review Requirements – Understand what the user expects.
- Prepare Test Cases – Based on functional specifications.
- Execute Tests – Run the software in real or simulated environments.
- Compare Results – Check if actual results match expected outcomes.
- Report Issues – Log any deviations or defects found.
Example for Validation Testing:
If the requirement states that “The system should allow users to reset their password via email,”
then validation testing will:
- Test if the “Forgot Password” feature exists,
- Check if an email with a reset link is sent,
- Confirm the link correctly allows password change.
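The password-reset requirement above can be validated against a test double instead of a live mail server. A sketch, assuming a hypothetical `PasswordService` with an in-memory outbox (all names are invented for this example):

```python
class PasswordService:
    """Hypothetical system under validation, with a fake email outbox."""
    def __init__(self):
        self.outbox = []  # captured emails instead of a real mail server
        self.passwords = {"alice@example.com": "old-pass"}

    def request_reset(self, email):
        """'Forgot Password' feature: send a reset link to known users."""
        if email in self.passwords:
            link = f"https://app.example/reset?u={email}"
            self.outbox.append({"to": email, "link": link})

    def reset(self, email, new_password):
        """Following the link lets the user set a new password."""
        self.passwords[email] = new_password

# Validation steps, mirroring the requirement
svc = PasswordService()
svc.request_reset("alice@example.com")
assert svc.outbox and svc.outbox[0]["to"] == "alice@example.com"  # email sent
svc.reset("alice@example.com", "new-pass")
assert svc.passwords["alice@example.com"] == "new-pass"           # change works
```

Each assertion maps directly onto one bullet of the requirement check: the feature exists, an email is sent, and the link allows the password change.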
Difference Between Verification and Validation:
| Aspect | Verification | Validation |
| --- | --- | --- |
| Focus | Ensures the product is built correctly | Ensures the right product is built |
| Question Answered | “Are we building the product right?” | “Are we building the right product?” |
| Based On | Design and specifications | User needs and requirements |
| Performed By | Developers and testers | End-users or QA team |
| Examples | Reviews, walkthroughs, inspections | System testing, acceptance testing |
Questions and Answers on SE (UNIT 3)
FAQ on Software Engineering (Unit 3)
- The statement "Designing is not Coding and Coding is not Designing" is a fundamental principle in software engineering that highlights the distinct and sequential nature of these two activities.
- They are separate phases in the software development lifecycle, each requiring a different skill set and mindset.
Designing: The Blueprint Phase 🗺️
- Designing is a high-level, abstract process. It involves the intellectual work of creating a plan or blueprint for the software.
- This phase happens before any code is written and it focuses on the overall structure and architecture of the system.
- Designers analyze the requirements, user needs, and business goals to determine the best way to solve a problem. They decide on the system's architecture, its major components, and how they will interact.
- Design involves making high-level decisions without getting bogged down in the implementation details. It's about thinking in terms of modules, interfaces, and data flow, not specific lines of code.
Coding: The Implementation Phase 💻
- Coding, also known as implementation, is the low-level, concrete process of translating the design specifications into executable source code.
- Coding follows designing and it focuses on the detailed mechanics of building the software.
- Coders take the design documents and write the actual code in a specific programming language. They focus on the syntax, algorithms, data structures, and the logic required to make the system function as designed.
Questions and Answers (Q&A) on SE (UNIT 5)
FAQ on Software Engineering (Unit 5)
In software engineering, risk refers to the possibility of an unwanted event occurring that can negatively affect the success of a software project.
- It is an uncertain event or condition that, if it occurs, has a positive or negative impact on project objectives like time, cost, quality, or performance.
- Risks can arise from many areas such as technology, people, processes, requirements, environment, or business.
👉 Examples of risks involved in software development:
- Key developer leaving the project (people risk)
- Technology newly introduced, not working as expected (technical risk)
- Sudden changes in customer requirements (requirement risk)
- Delay in hardware/software delivery (schedule risk)
To deal with risks, software engineers use two main strategies: Reactive and Proactive.
1. Reactive Risk Strategy ("Fix-it-later approach")
- In the reactive approach, no major effort is made to identify or plan for risks beforehand. This approach is also called the crisis-management approach.
- Risks are addressed only after they occur, which often leads to cost overruns and delays.
👉 Example: A project team that does not anticipate the risk of a developer leaving.
When a developer suddenly quits, the team scrambles to reassign tasks and train a replacement, causing project delays.
2. Proactive Risk Strategy
- In the proactive strategy of risk management, risks are identified, analyzed, and planned for before they occur.
- The team develops risk management plans, including mitigation and contingency strategies. This minimizes damage and increases project stability.
👉 Example: The project team anticipates that a key developer may leave.
A proactive team cross-trains other team members and documents code thoroughly. When the developer actually resigns, the impact is minimal, and the project continues smoothly.
Question 2:
Define Software Risk in detail. What are the different types of Risks involved in developing software?
- In Software Engineering, risk refers to the possibility of an undesirable event that can negatively impact the successful completion of a software project.
- Software risk refers to the probability of loss, combined with the consequences of that loss, that may occur during the development of software.
- Software risks can arise from people, process, technology, business, or external factors.
- Identifying, analyzing, and managing these risks early is crucial to ensure software projects finish on time, within budget, and with good quality.
A risk usually has three main components:
- Uncertainty – The risk may or may not happen.
- Loss – If the risk occurs, it can cause negative effects (cost, time, performance, quality, etc.).
- Impact – The severity of damage it can cause to the project or organization.
👉 Example:
- If requirements are not properly understood (uncertainty), the software may not satisfy user needs (loss), leading to project failure (impact).
Types of Software Risks:
Risks in software development can be categorized in different ways. A widely used classification is:
1. Project Risks
- Related to the environment in which the project is being developed.
- They affect schedule, resources, cost, and people.
- Examples:
- Unrealistic deadlines.
- Inadequate budget allocation.
- Lack of skilled developers/testers.
- Poor communication among team members.
2. Technical Risks
- Related to the technology used in the project.
- They threaten the quality, performance, or functionality of the software.
- Examples:
- Use of new or unproven technology.
- Integration issues with third-party tools or systems.
- Technical complexity not well understood.
- Performance or scalability failures.
3. Business Risks
- Associated with the market or business impact of the software product.
- Examples:
- Product fails to meet user needs.
- Competitors release better software earlier.
- Change in business priorities.
- Customer may cancel the project.
4. Operational Risks
- Risks that affect the daily operation and support of the software.
- Examples:
- Inadequate maintenance plan.
- Poor documentation for users or developers.
- System downtime or data loss after release.
5. External Risks
- Risks beyond the control of the project team.
- Examples:
- Changes in government regulations or legal policies.
- Natural disasters (flood, earthquake).
- Market changes due to new competitors.
6. Schedule Risks
- Directly related to time management of the project.
- Examples:
- Wrong effort estimation.
- Unexpected delays in key tasks.
- Dependency on external vendors or clients causing slippage.
7. Cost Risks
- Related to budget overrun and financial mismanagement.
- Examples:
- Underestimation of total development cost.
- Increase in hardware/software licensing fees.
- Unexpected resource requirement.
Question 3:
Briefly explain the steps involved in risk planning for software development.
Risk planning in software development involves identifying, analyzing, and preparing strategies to manage potential risks that could affect the project. The key steps are:
1. Risk Identification: List all possible risks that could impact the project, such as technical failures, cost overruns, schedule delays, or resource issues.
2. Risk Analysis: Evaluate each identified risk to determine its likelihood (probability of occurrence) and impact (effect on project objectives).
3. Risk Prioritization: Rank risks based on their severity (a combination of likelihood and impact) to focus on the most critical ones.
4. Risk Response Planning: Develop strategies to handle each major risk. Common strategies include:
- Avoidance: Change plans to eliminate the risk.
- Mitigation: Reduce the likelihood or impact.
- Transfer: Shift responsibility (e.g., insurance or outsourcing).
- Acceptance: Acknowledge the risk and prepare contingency plans.
5. Risk Monitoring and Control: Continuously track identified risks, detect new ones, and update risk plans throughout the project lifecycle.
These steps help ensure that risks are managed proactively, improving the project’s chances of success.
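The analysis and prioritization steps above are often quantified as risk exposure, the product of a risk's probability and the cost of the loss it would cause. A sketch with made-up risks and numbers, purely for illustration:

```python
# Risk prioritization sketch: rank risks by exposure = probability * cost.
# The risk list and all figures are hypothetical.

risks = [
    {"name": "Key developer leaves",     "probability": 0.30, "cost": 40000},
    {"name": "Requirements change late", "probability": 0.60, "cost": 25000},
    {"name": "Third-party API unstable", "probability": 0.10, "cost": 80000},
]

# Risk analysis: compute each risk's exposure
for r in risks:
    r["exposure"] = r["probability"] * r["cost"]

# Risk prioritization: highest exposure is handled first in the plan
ranked = sorted(risks, key=lambda r: r["exposure"], reverse=True)
for r in ranked:
    print(f'{r["name"]:28s} exposure = {r["exposure"]:8.0f}')
```

Note how the ranking differs from sorting by cost alone: the cheapest-per-event risk tops the list because it is the most likely to occur.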
Assignment-2 Questions on Software Engineering
- What is Risk? Explain the reactive and proactive risk strategies with appropriate examples.
- Define Software Risk in detail. What are the different types of Risks involved in developing software?
- Briefly explain the steps involved in risk planning in software development.
- Discuss the RMMM plan in detail.
- What is software quality? Write notes on different quality metrics.
- Describe the software quality dilemma in your own words.
- What are the objectives of testing? Explain the different types of testing.
- What is test case design? What are the various approaches available for test case design?
- What are the guidelines that lead to a successful software testing strategy?
- What is the difference between verification and validation?
- What is meant by integration testing? Explain top down and bottom up integration testing.
- Differentiate between black box and white box testing.
- Explain how Object Oriented software testing is different from conventional software testing.
- Explain unit testing and integration testing with respect to the Object Oriented context.
- What is meant by software reliability? How does reliability affect the quality of software being developed?
- Discuss various metrics available for measuring the quality of software.
- What do you mean by Risk Management? Describe the various methods used for Risk Projection.
- Define software reviews. Also explain formal technical reviews.
- Illustrate the design modeling concepts with an example.
- What are design principles? Explain them in detail.
- Explain about Black Box testing in detail.
- What do you mean by system testing? Explain in detail.
- What are the objectives and guidelines of Formal Technical Reviews.
- Elaborate the measures of Reliability and Availability.
- Depict Software Architecture in detail.
- Explain about Architectural design of software engineering.
- Explain about White Box testing with example.
- Examine the ISO 9000 Quality standards with respect to SW Quality.
- Explain the role of Software Reviews in developing good quality software.
- How can design patterns be used in designing software? Illustrate with an example.
- What are the key characteristics of good software architecture?
- Outline the concept of the art of debugging.
- How test cases are generated? Illustrate with example.
- Identify the importance of Software Quality Assurance (SQA) in maintaining the quality of software throughout its development.
- Explain how the Capability Maturity Model Integration (CMMI) helps the organization to improve their software development process.
Assignment -1 Q&A on Software Engineering
- Software Engineering is a systematic, disciplined, and quantifiable approach to the development, operation, and maintenance of software. It applies engineering principles to the creation of software, aiming for the production of high-quality, reliable, and cost-effective software products.
- A program is a set of instructions written in a programming language that performs a specific task.
- Software is a much broader term that includes the program itself, along with all associated documentation, data, and configuration files needed for it to operate correctly.
- Software engineering is a science because it relies on fundamental principles and theories.
- It uses scientific methods to analyze problems, model systems, and predict outcomes.
- Concepts like algorithms, data structures, and computational complexity are based on mathematical and logical foundations.
- It involves systematic analysis, experimentation, and the application of proven techniques to ensure predictable and reliable results.
- Requirements Engineering is a systematic process of discovering, documenting, analyzing, and managing software requirements.
- Its significance lies in ensuring that the software to be developed meets the actual needs of its users and stakeholders.
- Analysts, or requirements engineers, often face several significant challenges during the requirements-gathering process:
- Unclear or Conflicting Needs: Different stakeholders may have conflicting requirements, and it's the analyst's job to resolve these disputes.
- Tacit Knowledge: Users might not be able to articulate their needs because they are so familiar with their work. They often struggle to describe what they do and need until they see something concrete.
- Vague Language: Requirements are often written using vague terms like "fast," "easy to use," or "secure." These are subjective and can be interpreted differently by team members.
- Missing Requirements: Sometimes, critical requirements are simply forgotten or assumed, only to be discovered late in the development cycle when they are much more expensive to fix.
The Requirements Engineering process is often iterative, meaning the steps may be repeated and revisited as the project evolves. The core steps are:
Requirements Elicitation: 👂
- This is the process of gathering information from stakeholders.
- It involves understanding their needs, wants, and constraints related to the new software system.
- Techniques like interviews, surveys, workshops, and observation are used to uncover both explicit and implicit requirements.
Requirements Analysis: 🧐
- Once requirements are gathered, this step involves examining them to identify conflicts, ambiguities, and inconsistencies.
- The goal is to refine the raw information into a clear, concise, and complete set of requirements.
- This is where the Analysts prioritize and group requirements, as not all of them will have equal importance.
Requirements Specification: ✍️
- This step is about documenting the analyzed requirements in a formal and structured way.
- The primary output is typically a Software Requirements Specification (SRS) document.
- This document serves as a blueprint for the development team and includes details on both functional requirements (what the system should do) and non-functional requirements (how the system should perform).
Requirements Validation: ✅
- Before development begins, the documented requirements must be validated.
- This means checking that they are complete, consistent, and accurate and that they truly meet the stakeholders' needs.
- Validation can involve reviews, prototyping, and generating test cases to ensure the requirements are feasible and testable.
Requirements Management: 🗓️
- This is an ongoing activity throughout the project lifecycle.
- It involves tracking requirements, managing changes, and maintaining traceability between requirements, design, and testing.
- Requirements often change during a project, and this step ensures that any modifications are handled in a systematic and controlled manner to prevent "scope creep" and project deviations.
5. Interpret the following design concepts: Abstraction, Patterns, and Modularity.
Abstraction:
- Abstraction is the process of hiding complex details and showing only the essential features.
- In software design, it allows developers to focus on the high-level design without getting bogged down in implementation details.
- For example, when you use a "Car" object, you can "drive" it without needing to know the complex internal mechanics of the engine.
Patterns:
- Design patterns are reusable solutions to common problems in software design.
- They are not a finished design that can be directly transformed into code but are templates or descriptions for solving a problem that can be adapted to specific situations.
- For example, the Singleton pattern ensures that a class has only one instance.
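The Singleton pattern mentioned above can be sketched in a few lines of Python; the `Config` class name is a hypothetical example:

```python
class Config:
    """Hypothetical Singleton: only one shared instance is ever created."""
    _instance = None

    def __new__(cls):
        # Create the instance on first call, reuse it afterwards
        if cls._instance is None:
            cls._instance = super().__new__(cls)
        return cls._instance

a = Config()
b = Config()
assert a is b  # both names refer to the same single instance
```

This illustrates the "template" nature of patterns: the same structure can be adapted to any class that must have exactly one instance.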
Modularity:
- Modularity is the concept of dividing a system into discrete, self-contained components called modules.
- Each module can be developed and tested independently before being integrated into the larger system.
- This makes the system easier to understand, develop, and maintain.
- For example, a web application might have separate modules for user authentication, product catalog, and payment processing.
- Discuss the major areas of Applications of Software.
- Distinguish between generic software and customized software. Which one has large share of market? Why?
- What is the need for documentation in Software Engineering?
- Paraphrase User Requirements and System Requirements.
- Paraphrase the importance of software design. Explain the meaning of coupling and cohesion in software design.
- Express your idea on design patterns. How can patterns be used in software design?
Set 3:
- Explain the concept of Software Engineering and its significance in the development of software.
- Analyze the performance of Waterfall model and its limitations.
- What is Requirement Analysis? Explain the steps involved in it with illustration.
- Distinguish between functional and non-functional requirements.
- List the golden rules of User Interface design.
- What are myths in software development? Identify the software myths related to management and practitioners.
- What are the advantages of iterative development? Compare iterative development with incremental development approach.
- Present Software Requirements Document (in IEEE Format).
- Explain the role of user requirements in the Requirements Engineering process.
- Discuss about Object Oriented Analysis and Design.
- Summarize the key principles of Agile Development Model.
- Write notes on Process Assessment.
- Describe five desirable characteristics of a good SRS document.
- Differentiate between monolithic and microservices architecture.
- Justify the statement: "Designing is not Coding and Coding is not Designing"
Case Study on Software Engineering (Batches 17 - 22)
An Online Learning & Examination Platform
Scenario:
A mid-sized EdTech company is building EduConnect, a platform designed for schools, colleges, and professional training institutes. The system will:
- Allow students to enroll in courses, attend live classes, access recorded lectures, and take online tests.
- Enable teachers to upload content, conduct live sessions, track student performance, and evaluate assignments.
- Provide institutions with analytics on student progress, faculty workload, and overall course performance.
- Offer a mobile app for students and a web platform for teachers and administrators.
Key challenges include:
- Ensuring scalability when thousands of students log in simultaneously during exams.
- Preventing malpractice during online tests (cheating, impersonation).
- Handling multimedia content delivery (videos, interactive modules) efficiently.
- Supporting offline access for students with poor internet connectivity.
- Maintaining data privacy and compliance with educational data standards.
Batch 17:
- List the functional and non-functional requirements for EduConnect.
- Which stakeholders should be consulted during requirement gathering, and why?
Batch 18:
- Draw a use case diagram for the online examination process.
- Propose a suitable system architecture for handling live classes and high exam traffic.
- What security mechanisms would you design to prevent cheating in online exams?
Batch 19:
- Which software process model would you select for EduConnect, and why?
- How can continuous integration and deployment (CI/CD) pipelines benefit this project?
Batch 20:
- Propose a testing strategy for EduConnect, covering both the learning modules and exam modules.
- How would you test the system’s ability to handle 10,000 concurrent exam takers?
Batch 21:
- Identify potential risks in building EduConnect (technical, legal, organizational) and suggest mitigations.
- How would you manage version control and collaboration in a team of developers spread across different time zones?
Batch 22:
- How should EduConnect handle platform upgrades without affecting ongoing classes?
- Suggest possible future enhancements (AI proctoring, adaptive learning, gamification, integration with job portals).