Tutorials

Leading High Performance Teams

Diane Brescher
Building and maintaining effective teams is the mark of a strong leader. Making a team functional and cohesive requires courage and leadership, particularly when working with individuals who do not report to you. In this interactive, experiential workshop, you will explore what makes teams successful, with an emphasis on the leadership actions involved in building and maintaining teams. You’ll learn what distinguishes functional teams from dysfunctional ones. Core team concepts will be discussed, such as the five stages of team development, from forming through adjourning, along with the leadership actions at each stage. To build cohesive, high-performing teams, you will learn how to apply a model of five leadership behaviors focused on trust, conflict, commitment, accountability, and results. Through small group and class exercises, you’ll apply these concepts and create your own leadership action plan.

Learning Objectives:

  • Describe what successful teams look like
  • Outline the different stages of team development
  • Define your team’s stage and how to become a high performing team
  • Follow a model to help build functional teams
  • Influence team members over whom you have no direct authority

Tutorial Outline:

  • Team Basics:
    • Biggest team challenge
    • Teams vs. Groups
    • Group Exercise: Best and worst of team experience
    • Group Exercise: Team building exercise
    • Debrief
  • Stages of Team Development:
    • What is the role of the leader in building and maintaining teams?
    • Influencing and leading others over whom you may not have direct control
    • Stages of team development model
    • Class Exercise: Forming, Storming, Norming, Performing, Adjourning
    • Debrief: Core leadership actions at each stage
  • Behavior Model
    • Review of Patrick Lencioni’s model for building cohesive teams
    • Group Exercise: Elements to build the five behaviors
    • Debrief: Actions to create cohesive, functional teams
  • Action Plan
    • Individual leadership action plan

Agile Testing: How Lean Can You Get?

Clyneice Chaney

QA and testing budgets have experienced unprecedented and worrying climbs while still struggling to meet increased demands for efficiency, more and quicker test results, and good product quality. To make testing quicker and leaner, we must first ask: compared to what? Even though the techniques covered in this tutorial provide leaner, more rapid testing, the gains you experience will be relative to your current effectiveness and how well you implement the techniques. Leaner, agile testing is an overarching process of understanding what is to be achieved within the test project and maximizing the value of the time given. It involves cutting out anything that isn’t necessary and reconceiving testing as a process of inquiry instead of a clerical task. This tutorial provides an approach that testing organizations can use to optimize their testing processes. Testing groups face many challenges today, and this tutorial provides potential solutions to those challenges.

Learning Objectives:

  • Gain understanding of lean concepts in the context of testing
  • Be introduced to leaner testing strategies
  • Practice leaner testing techniques for an effective lean strategy

Tutorial Outline:

  • Agile testing and lean concepts
  • Testing Efficiency: Are you lean and efficient?
    • Techniques for evaluating testing efficiency and effectiveness
  • Waste Management – Getting rid of testing waste
    • Test Strategy
    • Test Design
    • Test Report

Mastering BDD Test Automation

Eran Kinsbruner
Being successful in continuous test automation requires both testers and feature teams to be “masters” of the software delivery processes, tools, and test automation authoring practices. Many organizations are choosing Behavior Driven Development (BDD) and Acceptance Test Driven Development (ATDD) as their software delivery practice of choice to bridge the gaps and overcome challenges in people skills, processes, and tools. In this tutorial, you will learn the basics of BDD test automation and how to build a BDD project that can be expanded as the project matures or changes. Using BDD practices, you will learn how to define the most accurate test automation coverage for native mobile, web, and responsive sites for continuous regression testing. You’ll experience first-hand a working BDD project using an open-source framework (Quantum) built for native mobile apps as well as responsive sites and executed through a cloud-based solution.

Learning Objectives:

  • Understand what BDD is, along with its key advantages and disadvantages
  • Recognize the material differences in test automation between mobile, web, and responsive apps
  • Build a BDD project based on a page object model that runs in parallel on various platforms

Tutorial Outline:

  • An Intro to BDD
    • What is driving the shift to this software development practice
    • Who are the stakeholders that make the BDD process work
    • How BDD differs from a formal software development life cycle
  • BDD 101
    • What are feature files, step definitions, scenarios, and a BDD project structure (Gherkin)
    • Working with mobile and web objects and page object model within BDD
    • Available tools and frameworks in the market that support BDD (e.g., Serenity BDD, Cucumber, Protractor, and more)
  • BDD in the Cloud (quantum project reference)
    • Configure a BDD project for responsive web and mobile apps to run in parallel in the cloud
    • Leverage reporting to maximize quality productivity
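
As background for the outline above, a minimal sketch of how a Gherkin scenario maps to step definitions. It uses plain Python with a hand-rolled matcher standing in for a real framework such as Cucumber or behave, and every name in it (the scenario text, `state`, the step functions) is illustrative, not from the tutorial:

```python
import re

# An illustrative scenario, as it might appear in a login.feature file.
FEATURE = """\
Given the user is on the login page
When the user signs in as "alice"
Then the greeting shows "alice"
"""

state = {}  # tiny in-memory stand-in for the application under test

def on_login_page():
    state["page"] = "login"

def sign_in(name):
    state["user"] = name
    state["page"] = "home"

def check_greeting(name):
    assert state["user"] == name, "greeting did not match"

# Step definitions: each regex binds a Gherkin line to a function,
# mirroring how Cucumber or behave wire steps to code.
STEPS = [
    (r'the user is on the login page', on_login_page),
    (r'the user signs in as "(\w+)"', sign_in),
    (r'the greeting shows "(\w+)"', check_greeting),
]

def run(feature):
    """Match each scenario line against the step table and execute it."""
    for line in feature.splitlines():
        text = re.sub(r'^(Given|When|Then|And)\s+', '', line)
        for pattern, fn in STEPS:
            match = re.fullmatch(pattern, text)
            if match:
                fn(*match.groups())
                break
        else:
            raise LookupError("No step definition for: " + line)

run(FEATURE)
```

The same shape — plain-language scenarios on one side, a table of bindings on the other — is what real BDD frameworks provide, along with reporting and parallel execution.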

From Ancient Greeks to Modern Geeks: Focusing on Data Privacy

Craig Laufer
Privacy is not a new issue. It was formally debated in Ancient Greece, and it is still debated today. Mention privacy in the context of test data and be prepared for a variety of reactions. The two main camps are (A) “The best testing requires production data.” and (B) “Production data should never be used in test environments.” In this tutorial, Craig Laufer will start with a brief discussion about privacy in general, but quickly turn to a technical view of test data privacy. He will review technologies that help with privacy and discuss their advantages and challenges. Masking has been discussed for the last decade, but it cannot solve all issues. Synthetic data generation gives supreme protection, but at the cost of limited test coverage. New technologies like blockchain and tokenization, as well as hybrids, are emerging. Join Craig to learn positive ways to move forward today to protect data by combining current technology with governance-driven business practices.
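
To make the trade-offs concrete, here is a minimal Python sketch of three of the techniques Craig compares: masking, the column shuffle, and synthetic generation. All field names and rules are invented for illustration; real masking and generation tools are far more sophisticated:

```python
import random

# Illustrative records; field names and values are hypothetical.
customers = [
    {"name": "Ann Lee", "ssn": "123-45-6789", "balance": 1200},
    {"name": "Bob Roy", "ssn": "987-65-4321", "balance": 3400},
]

def mask(record):
    """Masking: overwrite identifying fields with fixed-format dummies,
    keeping non-identifying fields usable for testing."""
    return {**record, "name": "XXXX", "ssn": "***-**-" + record["ssn"][-4:]}

def shuffle_column(records, field, seed=7):
    """The 'lowly shuffle': permute one column so values stay realistic
    but no longer line up with the right person."""
    values = [r[field] for r in records]
    random.Random(seed).shuffle(values)
    return [{**r, field: v} for r, v in zip(records, values)]

def synthetic(n, seed=7):
    """Synthetic generation: fabricate records with the right shape only.
    Maximum protection, but coverage is limited to what you generate."""
    rng = random.Random(seed)
    return [{"name": "User%d" % i,
             "ssn": "%03d-%02d-%04d" % (rng.randint(0, 999),
                                        rng.randint(0, 99),
                                        rng.randint(0, 9999)),
             "balance": rng.randint(0, 5000)} for i in range(n)]

print(mask(customers[0]))  # identifiers hidden, balance kept for testing
```

Note how each technique trades protection against fidelity: masking preserves the most test value, synthetic generation exposes the least real data.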

Learning Objectives:

  • Learn from past privacy philosophies ranging from Aristotle to Charles Schulz
  • Review technologies that can help with privacy
  • Delve into an often-overlooked fact about blockchain

Tutorial Outline:

  • Historical perspective on privacy
    • Privacy before computing
    • The tipping point
    • The many perspectives of today
    • The lowly shuffle, part one
  • Identifying danger zones
    • Production data
    • Test data
    • No fly zones
  • Risk management controls
    • Physical
    • Administrative
    • Technical
    • Helps
  • Focusing on tools and techniques
    • Encryption
    • Masking
    • The lowly shuffle, part two
    • Synthetic data generation
    • Upcoming tools
  • One way to put it together
    • An overlooked piece
    • Re-envisioning data
    • A cheap proof of concept
    • The lowly shuffle, part three

Tackling Mobile/IoT Testing Workshop

Costa Avradopoulos
According to the World Quality Report, “over half of organizations (56%) do not have the right process or methods in place to test mobile and IoT applications and 52% don’t have access to required devices.” Gartner states in a recent report, “companies that use traditional testing methods for emerging technologies like mobile/IoT, will fail 90% of the time”. Join Costa to explore the unique challenges of mobility and IoT. We start by covering the test strategy and continue with the top challenge of designing a proper device lab, given the thousands of unique devices in the market. Next, Costa provides insight into choosing the right devices to optimize test coverage and reduce risk. Then we dive deep into requirements and test cases and how they deviate from traditional methods. Lastly, Costa will explain how to leverage existing tools and evaluate automation options. With the knowledge gained, you’ll keep your team current with the faster pace of mobility, IoT, and other emerging technologies.

Learning Objectives:

  • Take a deep dive into what really makes mobile and IoT different
  • Create a proper mobile/IoT test strategy
  • Choose devices for testing, including a hands-on exercise in building a mobile lab
  • Inspect and review requirements for technology nuances
  • Write test cases for mobile/IoT to ensure proper scope and test coverage
  • Explore pros and cons of various mobile/IoT testing tools

NOTE: This tutorial has hands-on activities, exercises, discussions, and demos. Bring a mobile device (smartphone or tablet).

Tutorial Outline:

  • Introduction to the Mobile/IoT World & Current Challenges
  • Mobile/IoT Test Strategy
  • Mobile/IoT Testing Environments
  • Breakout Exercise: Building a World Class Test Lab
  • Mobile/IoT Requirements
  • Mobile/IoT Testing Scenarios & Test Cases
  • Mobile/IoT Testing Methods
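
Choosing the right devices to optimize coverage, one of the challenges above, is often approached with a risk-based device matrix. A minimal greedy sketch, assuming invented market-share numbers (a real matrix would come from your own analytics):

```python
# Illustrative market-share figures; device names and numbers are invented.
devices = {
    "Pixel 8": 0.12,
    "iPhone 15": 0.25,
    "Galaxy S24": 0.18,
    "iPhone 13": 0.15,
    "Moto G": 0.05,
}

def pick_devices(share, target=0.6):
    """Greedy pick: add the highest-share devices until the lab covers
    the target fraction of the user base."""
    chosen, covered = [], 0.0
    for name, s in sorted(share.items(), key=lambda kv: -kv[1]):
        if covered >= target:
            break
        chosen.append(name)
        covered += s
    return chosen, covered

lab, coverage = pick_devices(devices)
print(lab, round(coverage, 2))
```

The point of the exercise is the trade-off itself: a handful of well-chosen devices can cover most users, and the rest of the long tail is handled by risk acceptance or cloud device farms.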

Improving your Leadership Skills with Open Space Concepts

Mike Kaufman and Joanne Stone
Have you been to an open space? Open Space is a way to self-organize meetings around what is most important to people. The underlying concepts of open space are simple, yet powerful. It brings people together around a topic or need; participants create their own agenda and form groups to brainstorm, debate, and discuss the topic. The overall Open Space is opened and held by a single facilitator. The truth is, open space facilitators are very similar to servant leaders. A servant leader puts the needs of others first with an aim to achieve results for their organization. Each holds a space that promotes safety, collaboration, and empowerment with little direction and very simple laws. Join Mike and Joanne as they invite you to explore how to apply open space concepts to improve your leadership skills with your testing teams or Agile teams.

Learning Objectives:

  • What open space and holding space are, and how to hold space with your teams
  • Why holding space is similar to servant leadership
  • The benefits of holding space
  • Experience holding space to grow leadership skills

Tutorial Outline:

  • Holding Space Principles
    • Activity – Selecting principles to discuss using the TAO of Holding Space
    • Learning outcome – Awareness of the many principles
    • Learning outcome – Using the TAO to improve holding space
    • Discussion – Characteristics for open space facilitators and servant leaders
    • Discussion – Benefits to ourselves and the teams
    • Discussion – Personal Journeys of Facilitators using open space concepts and tools
  • Facilitator Experience
    • Activity – In small groups, facilitators hold the space for group dialogue to occur
    • Learning outcome – Facilitators’ feeling about their holding space experience
    • Discussion – Challenges with holding space and tools that help
  • Outcomes
    • Wrap Up – Were all participants’ outcomes covered?
    • Discussion – Tutorial take-aways

Quality at Speed: Maximizing Efficiency with an Agile Testing Methodology

Jeff Van Fleet and Julie Hagan
As the shift to Agile has helped project teams deliver software releases faster than ever, it is putting tremendous strain on testing and QA teams to deliver quality results in less time. Backlogs grow, pressure mounts, and costly defects leak into production, all because QA cannot keep up. The problem isn’t the testers; it’s the methodology. All too often, QA teams fail to amend their approach when they shift to Agile, leading them to implement Waterfall solutions in a decidedly non-Waterfall world, with oftentimes disastrous results. And when those approaches fail, teams often abandon their methodologies completely, opting instead for an ad-hoc approach that’s unmeasured, inconsistent, and uncontrollable. In this tutorial, we’ll discuss the benefits of applying a measurable, metrics-based testing methodology that’s built for Agile from the ground up.

Learning Objectives:

  • Understanding fundamental principles of Agile testing
  • Establishing a quantitative test plan
  • Preventing QA from bottlenecking sprints
  • Testing early work products to improve quality
  • Assessing Agile testing maturity and building a roadmap for continuous improvement

Tutorial Outline:

  • Introduction to Agile
    • Brief overview of Agile Methodology/Scrum
  • Principles of Agile testing
    • In-Sprint Testing and RTM
    • Hardening and integration
  • Test Planning and measurement
    • Adapting test planning to a sprint-based world
    • Basic and advanced Agile metrics
  • Shift Left
    • Implementing user story inspections, code analysis, etc.
  • Moving to Test Automation for Regression
    • Considerations for test automation
  • Assessing Agile Testing Maturity
    • Conducting a self-assessment and actioning on findings
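
Two of the basic quantitative signals such a metrics-based plan might track can be sketched as follows. The formulas are common industry definitions, offered as an assumption rather than the presenters’ specific metrics:

```python
def defect_leakage(found_in_prod, found_in_test):
    """Share of total defects that escaped QA into production;
    a standard signal of test effectiveness."""
    total = found_in_prod + found_in_test
    return found_in_prod / total if total else 0.0

def in_sprint_pass_rate(passed, executed):
    """Fraction of executed in-sprint tests that passed."""
    return passed / executed if executed else 0.0

print(round(defect_leakage(3, 47), 2))         # 0.06
print(round(in_sprint_pass_rate(92, 100), 2))  # 0.92
```

Tracked sprint over sprint, numbers like these turn “QA cannot keep up” from a feeling into a measurable trend that can be actioned.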

Test Automation for Manual Testers

Jim Trentadue
Test Automation is one of the most talked about topics in the software testing industry today. Everyone wants it, everyone wants to be involved with it, yet almost 70% of the QA industry still tests manually. Test leaders must be prepared to answer for their automation efforts: why they aren’t doing more, or why they aren’t doing it at all.
Come join this ‘first of its kind’ tutorial, geared to professionals who do not have working knowledge of test automation. You are guaranteed to leave with a strong knowledge base. Whether you are a Test Leader, Test Analyst, or supporting IT professional, this tutorial is perfect for you. All test automation solutions are built on three key principles: accessibility, modularity, and reliability. This tutorial is tool-agnostic, but we’ll talk about the tools that may be in your organization.

Learning Objectives:

  • Understand, learn and apply the ACCESSIBILITY principle
  • Understand, learn and apply the MODULARITY principle
  • Understand, learn and apply the RELIABILITY principle

Tutorial Outline:

  • Automation trends & errors:
    • Test Automation: Past, Present & Future
    • Frameworks used in the industry
    • Review of key failures
  • Automation basics – definitions of key terms:
    • Defining accessibility and how to interact with objects
    • Defining modularity and how test case structures are critical to effectiveness
    • Defining reliability and how to differentiate and isolate issues
  • Object accessibility:
    • Principles of object accessibility in an application under test
    • Identify static vs. dynamic attributes for elements
    • Navigate through an object’s tree structure
  • Development of Test Case modularity:
    • Review ground rules for structuring modular test cases
    • Class Exercise: Build a test case abstract from a supplied manual test case
    • In groups of four, present the revised test case
    • Review and provide feedback to all as a group
    • Debrief: Automation benefits and insertions into the modular case(s)
  • Analysis of Test Execution runs for reliability:
    • Most common errors for the application under test and test scripts
    • Class Exercise: Identification of the error source
    • Debrief: Answer review with corrective measures
  • How manual testers can break into automation:
    • Plan your automation activities into the testing process
    • Prepare data for driving your automated tests
    • Organize your object repository in a structured manner
    • Make test steps more actionable
    • Add error-handling into the testing workflow
    • Debug techniques for isolating errors
    • Read and interpret report results
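
The accessibility and modularity principles above are commonly realized with a page-object structure. Here is a minimal sketch in plain Python, with a fake driver standing in for a real tool such as Selenium; all locators, page names, and the echoed banner are hypothetical:

```python
# A fake driver stands in for a real automation tool (e.g. Selenium),
# so the structure is runnable without a browser.
class FakeDriver:
    def __init__(self):
        self.fields = {}
    def type(self, locator, text):
        self.fields[locator] = text
    def read(self, locator):
        return self.fields.get(locator, "")

class LoginPage:
    # Accessibility: the page object owns its locators, preferring
    # stable (static) attributes like IDs over brittle dynamic XPaths.
    USER = "id=username"
    PASS = "id=password"
    def __init__(self, driver):
        self.driver = driver
    def login(self, user, password):
        self.driver.type(self.USER, user)
        self.driver.type(self.PASS, password)
        return HomePage(self.driver)

class HomePage:
    BANNER = "id=welcome"
    def __init__(self, driver):
        self.driver = driver
        # the fake "server" echoes the typed username into the banner
        driver.type(self.BANNER, "Welcome, " + driver.read(LoginPage.USER))
    def banner(self):
        return self.driver.read(self.BANNER)

# Modularity: the test case itself is a short script of reusable steps.
home = LoginPage(FakeDriver()).login("qa_user", "secret")
assert home.banner() == "Welcome, qa_user"
```

Because locators live in one place per page, a UI change touches one class instead of every test, which is the reliability payoff the tutorial’s three principles aim for.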

Methodology for Outcomes-Based Testing Requirements

Clareice Chaney and Clyneice Chaney

Outcome is one of the new buzzwords and is central to aligning organizational or project visions and goals with measurable expectations. Testing organizations, as service entities, need to provide information about both product readiness and service quality. Many testing organizations struggle with how best to define both their service and their outcomes within a testing relationship. Testing as a Service (TaaS) organizations in particular are interested in better aligning their work to outputs.

Join us as we address key questions of interest. What are good outcomes for a testing organization? Do outcomes in agile development projects differ from outcomes in more traditional projects? How can outcomes be used for optimum benefit? What, how, and why should you define outcomes? Are outcomes related to service level agreements? Whether you are an agile team member, an outsourced testing team, or the manager of a testing organization, this tutorial provides practical and valuable information and examples for viable testing service outcomes and their use in today’s world.

Learning Objectives:

  • Be able to define viable testing outcomes for projects, products, testing relationships, and processes
  • Be able to utilize testing related outcomes for a variety of situations such as service levels, reporting and outsourcing
  • Utilize a methodology for drafting outcomes-based requirement documents

Tutorial Outline:

  • Section 1: Outcomes as a Means of Measuring Performance
    • What they are as opposed to other measures/metrics
    • Why you need them
    • How: Approaches/Methodology for Developing and Documenting
  • Section 2: Useful Software Testing Outcomes
    • Standard, traditional testing approach outcomes
    • Agile outcomes
  • Section 3: Applying Testing Outcomes
    • Outsourced Testing
    • Service Levels Agreements
    • Testing Organization Performance
  • Section 4: Monitoring and Reporting Testing Outcomes
    • Standard, traditional testing approach outcomes
    • Agile outcomes

Connecting your Open Source Tools and Creating an Integrated Results Dashboard

Jennifer Bonine
With all the open source tools available on the market, it can be overwhelming to determine which ones meet your needs and which will work best in your environment to create a high-performing team. Join Jennifer as she explains the relationship between the DevOps cycle and your environment, and how a hub and spoke model can link all your different data sets and tools together. She identifies opportunities for applying test data analytics across the engineering and test landscape, ranging from high-value test cases to dynamically generated regression test suites. She reviews ways to collaborate and show results in a way that clearly demonstrates progress, and how to present a visual dashboard to your leadership and stakeholders in the organization. Most importantly, Jennifer provides tips to improve your skill set, and your mindset, so you will eagerly embrace the application of test data analytics in your test engineering practices.

Learning Objectives:

  • Understand tool optimization using open source options
  • Learn how to evaluate tools for your environment
  • Explore dashboard samples to integrate data sources

Tutorial Outline:

  • Data
    • History of data
    • Using “Big Data” in decisions
    • Exploring data activity
  • Tool Optimization
    • Open source options
    • Tools evaluation
  • Dashboards
    • Hub and spoke model
    • Data sources integration
    • Data analytics
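
The hub-and-spoke model above can be sketched as a small aggregator: each spoke adapts one tool’s data into a common record shape, and the hub merges the records into one dashboard row per test suite. The tool names and fields are illustrative, not from the tutorial:

```python
# Spoke adapters: each converts one tool's native results into a common
# record shape. In practice these would call a CI server or tracker API;
# here they return hard-coded illustrative data.
def ci_spoke():
    return [{"suite": "smoke", "passed": 18, "failed": 2}]

def tracker_spoke():
    return [{"suite": "smoke", "open_defects": 3}]

def hub(*spokes):
    """Hub: merge spoke records by suite into one dashboard row each."""
    rows = {}
    for spoke in spokes:
        for record in spoke():
            row = rows.setdefault(record["suite"], {"suite": record["suite"]})
            row.update({k: v for k, v in record.items() if k != "suite"})
    return list(rows.values())

dashboard = hub(ci_spoke, tracker_spoke)
print(dashboard)  # one merged row per suite, ready for a dashboard view
```

The design choice is that adding a new tool means writing one new spoke, while the hub and the dashboard on top of it stay unchanged.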