Webinar Archives Details

QAI Global Institute is proud to host complimentary webinars focused on a variety of important topics in software quality and testing. Each webinar is presented by an industry expert – providing insights to assist IT, QA, and Software Testing professionals in the field. For those professionals who could not attend the live presentation, the webinars are recorded and made available for later viewing.

Testing the Next Generation of Technologies, IoT, Mobile, and Cloud with Costa Avradopoulos

This webinar covers recent trends/challenges in testing IoT, Mobile, and Cloud applications. Costa discusses the components that go into a proper test strategy, such as building a test lab, test coverage, test data, test management, tools, and automation.
About the speaker

Costa Avradopoulos is a recognized thought leader with over 25 years of experience in software development, from requirements to delivery and process engineering. His former roles include Developer, Tester, Automation Engineer, QA Director, Product Manager, and CTO. He has been responsible for systems of over 100 million users, in verticals such as Telecommunications, Financial Services, Wireless, Transportation, DSD, Digital Imaging, Media & Entertainment, and Retail. A visionary innovator, Costa has devoted much of his career to mobility, culminating with a breakthrough invention, a patented mobile technology. Costa is a frequent speaker and is writing a book due to be published soon, titled “Winning Mobile Strategies – Bridging the Quality Gap”. Costa is currently CEO of Avracom and formerly the Mobile Testing Practice Leader for Capgemini. He holds Six Sigma and TPI certifications.

The Tao of How: An Ancient Path for Future Leaders with Jeff Dalton

Building and leading “Large Agile” presents its own set of challenges beyond adopting scrum, XP, or even SAFe. As more large organizations embrace agile, the “culturalization” of values and behaviors at scale has become the most difficult task facing current and future Agile leaders. Solving this problem doesn’t require a new process, a complex model, or a lot of high-cost consulting. “The Tao of How” explores the value and components of agile capability, and why understanding the “how” is a necessary prerequisite to success for any leader of an organization on the path to adopting, transforming, and ultimately mastering Agility.
About the speaker

Jeff Dalton is an author and leadership coach who is Chief Evangelist at AgileCxO.org, a research and development organization that studies agile leadership at self-organizing companies. He is author of “The Guide to Scrum and CMMI: Improving Agile Performance with CMMI,” where he advocates a disciplined and robust model for succeeding with scalable agility. He is also author of the Agile Performance Holarchy, a model for scaling self-organization across the enterprise. In his spare time, he builds experimental aircraft and plays bass in a jazz band.

Defect Sampling – An Innovation for Focused Testing with Randy Rice

As software testers, our job is to find defects in software. Unfortunately, there are no signs that read, “Look for defects here!” Too often, testers look for defects in ways that involve guesswork and superficial tests that may or may not be productive. In addition, much time can be wasted in designing and performing unproductive tests.

In this webinar, Randy Rice reinforces the idea that all testing is sampling. The key question is, “Where and how do we take the best test samples?” Based on the results of sampling combined with the idea that defects tend to cluster, we can then focus additional testing to find other similar defects.

The good news is that you don’t need a new tool or a big budget to perform intelligent sampling. The main thing needed is learning how to think about software defects in different ways.
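
To make the clustering idea concrete, here is a minimal Python sketch (not Randy’s process; the module names and counts are hypothetical) that allocates a follow-up test budget in proportion to the defects an initial sampling pass has already surfaced:

    def allocate_extra_tests(defects_by_area, extra_budget):
        """Weight follow-up testing toward areas where sampled tests already
        found defects, on the assumption that defects tend to cluster."""
        total = sum(defects_by_area.values())
        if total == 0:
            # No clusters found yet, so spread the budget evenly.
            share = extra_budget // len(defects_by_area)
            return {area: share for area in defects_by_area}
        return {area: round(extra_budget * count / total)
                for area, count in defects_by_area.items()}

    # Hypothetical results from an initial sampling pass of 20 tests:
    samples = {"checkout": 7, "search": 1, "profile": 0, "reporting": 2}
    print(allocate_extra_tests(samples, extra_budget=40))
    # -> {'checkout': 28, 'search': 4, 'profile': 0, 'reporting': 8}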

Learning Objectives:

  • Learn about the true nature of software defects and why testers typically don’t find as many defects as they would like to find.
  • Learn an effective, proven process for designing and performing tests faster by sampling instead of trying to test everything.
  • Learn how you can apply the sampling concept to test automation by creating automated tests that sample, then create new automated tests if a failure is discovered.

About the speaker

Randy Rice is a thought-leading author, speaker, and consultant in the field of software testing and software quality. He advises CEOs, CIOs, and CFOs in organizations worldwide on improving the quality of their information systems and optimizing their testing processes. Randy has over 38 years of experience building and testing mission-critical projects in a wide variety of environments. He is experienced with project management, software development life cycles, IEEE software engineering standards, and cybersecurity testing. Randy served as chair of the Quality Assurance Institute’s International Software Testing Conference from 1995 to 2000 and was a founding member of the Certified Software Tester (CSTE) certification program. He is co-author, with William E. Perry, of the books Surviving the Top Ten Challenges of Software Testing and Testing Dirty Systems.

Making the Move to Behavior Driven Development with Ryan Yackel

Properly implemented Behavior Driven Development (BDD) helps to drive increased automation, achieve quicker development cycles, facilitate better collaboration between departments, and reduce siloed communication. Additionally, BDD is an ideal counterpart to continuous integration/delivery because it relieves testing bottlenecks. Despite these benefits, BDD is under-adopted: studies find that between 10% and 25% of development organizations have implemented or are experimenting with a BDD process. Ryan will discuss how BDD moves testing up front to avoid rushed end-of-cycle testing and how automation is included from the start to achieve automation coverage. BDD encourages developers to think about testability, build more testable software, and push software to customers just in time as it is developed. Ryan will also share a successful framework for evaluating your readiness for BDD, considering any potential roadblocks, and making a seamless transition. There will be plenty of takeaways, from teams just learning about BDD all the way through to teams that have already made a stable transition.
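
To make the idea of behavior-first, executable specifications concrete, here is a minimal, tool-agnostic Python sketch; the cart and discount rule are purely hypothetical, and BDD teams would more typically write the scenario in Gherkin and bind it with a framework such as Cucumber or SpecFlow:

    class Cart:
        """Hypothetical shopping cart used only to illustrate the behavior."""
        def __init__(self):
            self.items = []

        def add(self, name, price):
            self.items.append((name, price))

        def total(self):
            return sum(price for _, price in self.items)

    def checkout(cart, discount_threshold=100.0, discount_rate=0.10):
        """The behavior under specification: orders over the threshold get a discount."""
        total = cart.total()
        return total * (1 - discount_rate) if total > discount_threshold else total

    def test_discount_applies_to_orders_over_100():
        # Given a cart holding goods worth more than 100
        cart = Cart()
        cart.add("keyboard", 60.0)
        cart.add("monitor", 90.0)
        # When the customer checks out
        total = checkout(cart)
        # Then a 10% discount is applied
        assert round(total, 2) == 135.0

    if __name__ == "__main__":
        test_discount_applies_to_orders_over_100()
        print("scenario passed")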
About the speaker

Ryan Yackel is the Director of Product Marketing at QASymphony, ensuring the company’s continued commitment to innovation and delivering tools to create better software. With a deep interest in emerging testing trends, Ryan is dedicated to being the customer voice for all QASymphony products. Ryan comes to QASymphony from Macy’s Inc., where he managed testing on large enterprise initiatives delivering logistics implementations for their warehouse management systems. Ryan is a Certified ScrumMaster through the Scrum Alliance and holds a Bachelor of Arts degree from Covenant College.

Focus on the Impact of Code Changes to make Agile more Agile with Mark Lambert

Agile is often mis-sold to senior management as a way of achieving quicker time-to-market, when the objective is really more accurate delivery to market. Teams are releasing more often, but it ultimately takes longer to get the complete functionality to market. As the team focuses on validating the new functionality implemented, a lack of understanding of the indirect impact of code changes causes defects to be detected late in the release cycle, when they are complicated, time-consuming, and costly to fix.

What is needed is a way to understand the impact of the changes and identify where to focus testing efforts (unit testing, automated functional testing, and manual testing) to validate that existing features are not negatively impacted by the most recent changes.
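
As a rough illustration (a sketch only, not Parasoft’s analytics), the Python script below focuses testing on the latest changes by mapping each changed source file to its mirrored test file; the src/ and tests/ layout and the test_*.py naming convention are assumptions for the example:

    import subprocess
    from pathlib import Path

    def changed_source_files(base="HEAD~1", head="HEAD"):
        """List source files touched between two builds (git revisions)."""
        out = subprocess.run(["git", "diff", "--name-only", base, head],
                             capture_output=True, text=True, check=True).stdout
        return [p for p in out.splitlines()
                if p.startswith("src/") and p.endswith(".py")]

    def tests_for(source_path):
        """Map a changed module to its mirrored test file, if one exists
        (assumed convention: src/foo/bar.py -> tests/foo/test_bar.py)."""
        rel = Path(source_path).relative_to("src")
        candidate = Path("tests") / rel.parent / f"test_{rel.name}"
        return [str(candidate)] if candidate.exists() else []

    if __name__ == "__main__":
        focused = sorted({t for f in changed_source_files() for t in tests_for(f)})
        # Run only the tests affected by the latest change set; fall back to
        # the full regression suite when no mapping is found.
        print("Prioritized tests:", focused or "none mapped - run the full suite")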

Learning Objectives:

  • Apply testing practices to speed up quality during iterations, by focusing on changes in the code base between builds
  • Prioritize creation of new tests on the changed lines of code not covered by existing regressions
  • Leverage intelligent analytics, including Change-Based Testing, Modified Code Coverage, and Risky Code Change, to prioritize your agile testing activities

About the speaker

Mark Lambert is responsible for ensuring that Parasoft solutions deliver real value to the organizations adopting them. His team helps customers optimize their software development processes by assessing their specific development needs, then determining how to apply Parasoft technologies, processes, and methodologies to achieve their goals.

Improve Test Strategies and Outcomes with Mind Maps with Jennifer Bonine and Karen Schaefer

Do you ever sit in test strategy or test plan review sessions and get little or no participation from others? Are you looking for a better way to communicate important information around the test plan or strategy? Do you want your stakeholders to understand and engage in providing feedback and suggestions?

Jennifer Bonine and Karen Schaefer of tap|QA have a solution for you—a mind mapping tool that can help you address these questions!

A mind map is a visual approach to organizing information, as an alternative to a text outline or list. Jennifer and Karen will help you download a free mind mapping tool, show you how to use it, and then take a real-life problem and solve it using the tool.

They will discuss the benefits of mind maps for solving problems and for communicating information about your testing plan and strategy. You’ll be able to take a working approach back to your organization that you can use to improve velocity in your testing cycles and planning sessions.

Learning Objectives:

  • Learn what a mind map is, and how to create one
  • Learn how mind maps can help improve communication within a team
  • Specifically learn how to use a mind map for test plans / test strategies

About the speakers

Jennifer Bonine is VP of Global Delivery and Solutions for tap|QA, Inc. She began her career in consulting, implementing large ERP solutions. Jennifer has held executive-level positions leading development, quality assurance and testing, organizational development, and process improvement teams for Fortune 500 companies in several domains. In an engagement for one of the world’s largest technology companies, Jennifer served as a strategy executive and in corporate marketing for the C-Suite. In her career, she has had several opportunities to build global teams from the ground up and has been fortunate to see how many of the world’s top companies operate from the C-Suite viewpoint.

Karen Schaefer is a QA management, process, and methodology expert with 25 years of experience. Karen has a strong track record of applying her analytical and organizational talent to continually refine her teams’ testing approaches in order to keep them ahead of fast-paced release cycles. She also uses her experience to mentor software testers in new methodologies, and has been co-chair of the Professional Development committee for a Women in Technology leadership team. Prior to joining tap|QA, Karen served in several quality assurance management positions at enterprise tech companies such as Calabrio, Univita Health, Shavlik Technologies, and Stellent Inc.

Design Thinking in Agile with Jack Caine

Some of the principles behind the Agile Manifesto include:

  • Our highest priority is to satisfy the customer through early and continuous delivery of valuable software.
  • Welcome changing requirements, even late in development. Agile processes harness change for the customer’s competitive advantage.
  • Business people and developers must work together daily throughout the project.
  • Build projects around motivated individuals. Give them the environment and support they need, and trust them to get the job done.
  • The most efficient and effective method of conveying information to and within a development team is face-to-face conversation.
  • Working software is the primary measure of progress.
  • Continuous attention to technical excellence and good design enhances agility.
  • Simplicity–the art of maximizing the amount of work not done–is essential.
  • The best architectures, requirements, and designs emerge from self-organizing teams.

In this webinar, Jack highlights how design thinking can effectively fit in agile, scrum and SAFe and live up to the principles of the agile manifesto.
Learning Objectives:

  • The basic flow of design thinking
  • Various models incorporating design thinking into agile, scrum, and SAFe

About the speaker

Jack Caine is a seasoned enterprise-level agile-lean transformation coach, trainer, mentor, facilitator, practitioner, program consultant, editor, author, and speaker. He has practical experience transforming organizations; coaching and training executives, directors, and managers at the enterprise, portfolio, and program levels; and coaching ScrumMasters, product owners, analysts, QA, architects, tech leads, and developers at the team level.

Planning for SAFe PI Planning with Jack Caine

The Agile Manifesto states, “The most efficient and effective method of conveying information to and within a development team is face-to-face conversation.” SAFe takes this to the next level with PI Planning, a routine, face-to-face event, with a standard agenda that includes a presentation of business context and vision followed by team planning breakouts.

Here, the teams create their iteration plans and objectives for the upcoming PI. Facilitated by the Release Train Engineer (RTE), this event includes all members of the ART, whenever possible. It takes place over two days, and occurs within the Innovation and Planning (IP) Iteration. The result of planning is a commitment to an agreed set of Program PI objectives for the next PI. Holding the event during the IP iteration avoids affecting the timebox, scheduling, or capacity of other iterations in the PI.

In this webinar, Jack highlights the primary activities and resources that an RTE uses to plan for the SAFe PI Planning sessions.

Learning Objectives:

  • What resources are available to help you plan your PI Planning session
  • What steps you might consider in preparing for your PI Planning session

About the speaker

Jack Caine is a seasoned enterprise-level agile-lean transformation coach, trainer, mentor, facilitator, practitioner, program consultant, editor, author, and speaker. He has practical experience transforming organizations; coaching and training executives, directors, and managers at the enterprise, portfolio, and program levels; and coaching ScrumMasters, product owners, analysts, QA, architects, tech leads, and developers at the team level.

Essential Patterns of Mature Agile Teams with Bob Galen and Shaun Bradshaw

Many teams have a relatively easy time adopting the tactical aspects of the agile methodologies. Usually a few classes, an introduction to some tools, and a bit of practice lead teams toward a fairly efficient and effective adoption. However, often these teams get “stuck” and begin to regress or simply start going through the motions—neither maximizing their agile performance nor delivering as much value as they could.

Borrowing from their experience and lean software development methods, join Bob Galen and Shaun Bradshaw in this interactive and collaborative workshop as they examine essential patterns—the “thinking models” of mature agile teams—and explore how mature agile teams operate so that you can model them within your own teams.

Along the way, you’ll examine patterns for large-scale emergent architecture, relentless refactoring, quality on all fronts, pervasive product owners, lean work queues, stretching above and beyond, providing total transparency, saying “No,” and many more. Bob and Shaun will also explore the leadership dilemma of self-directed teams and why there is still the need for active and vocal leadership in defending, motivating, and holding agile teams accountable.

Key Takeaways:

  • The importance of multi-level Done-ness criteria in driving team deliverable value
  • Practical execution tips for the team to efficiently work through their iteration tasks
  • Central practices to help you deliver on the high-quality promise of Agile Methods
  • Learn the mature collaborative style for agile teams to produce great results and how to achieve it
  • The key aspects of a strong agile customer and how that affects the overall team capacity to deliver value

About the speakers

Bob Galen is the Director of the Agile Practice at Zenergy. Bob is an in-demand agile adoption coach, trainer, and consultant with over 10 years of agile experience across Software, QA/Test, and Project Management. Bob’s specialty is Agile at-Scale challenges.

Shaun Bradshaw is the VP of Consulting Solutions at Zenergy. Shaun is an experienced test manager, consultant, and trainer with over 15 years of multi-domain experience. Shaun is a software QA/Testing strategist with deep Agile experience. Shaun is a certified Scrum Master.


How do you Build a Culture of Innovation? with Ripi Singh

Although innovation is an executive priority, there are flaws in how it is managed. The situation can significantly improve when innovation is systematically managed. In this webinar, Ripi highlights the way to inspire innovation for enterprise excellence. He further defines 5 levels of innovation robustness and the 4 primary activities required to systematically progress the journey of innovation in your organization.

Key Takeaways:

  • Building a Strategic Roadmap of product and service offerings
  • Building expertise to deliver on the Roadmap promise
  • Developing innovative products and services
  • Improving productivity through alignment of processes

Using live examples and case studies, Ripi will also demonstrate the value of this approach. The approach has evolved over years of successfully managing turnaround and startup R&D units, and from association with successful business leaders and with engineering and business schools.

About the speaker

Dr. Ripi Singh is an innovation and productivity coach with over 20 years of experience in product, technology, process, and people leadership, spanning aerospace and defense, renewable energy and power, healthcare and medical devices, advanced manufacturing, and IT domains. During his service years, culminating as Director of R&D at Alstom Power (now a division of General Electric), Dr. Singh successfully delivered advanced technologies on high-impact, leading-edge aviation and energy programs. He also performed foundational research and engaged in teaching at various prestigious universities around the world.
Dr. Singh has won numerous national and corporate honors, has authored over 70 publications, and has edited over 250 lectures, patents, and books. His influence spans from mentoring young minds to coaching business owners, from startups to growth companies, and from commodities like garments to futuristic technologies like drones.
Currently, Dr. Singh is proud to be on a mission to inspire innovation using a holistic approach he has developed over the years of being a technology and business leader.

Essential Patterns of Mature Agile Leadership with Bob Galen and Shaun Bradshaw

Currently, so much of agile adoption—coaching, advice, techniques, training, and even the empathy—revolves around the agile teams. Leaders are typically ignored or marginalized at best, and in the worst cases often vilified. But Bob Galen contends that there is a central and important role for managers and effective leadership within agile environments.

In this workshop, we’ll explore the patterns of mature agile managers and leaders: those who understand Servant Leadership and how to effectively support, grow, coach, and empower their agile teams in ways that increase the teams’ performance, accountability, and engagement.

We’ll explore training and standards for agile adoption, and situations and guidelines for when to trust the team and when to step in and provide guidance and direction. We’ll examine the leader’s role in agile at-scale and with distributed agile teams. Good leadership is a central ingredient to sustaining your agile adoption. Bad leadership can render it irrelevant or a failure. Here we’ll walk the path of the good, but also examine the bad patterns to inspire you and your teams.

About the speakers

Bob Galen is the Director of the Agile Practice at Zenergy. Bob is an in-demand agile adoption coach, trainer, and consultant with over 10 years of agile experience across Software, QA/Test, and Project Management. Bob’s specialty is Agile at-Scale challenges.

Shaun Bradshaw is the VP of Consulting Solutions at Zenergy. Shaun is an experienced test manager, consultant, and trainer with over 15 years of multi-domain experience. Shaun is a software QA/Testing strategist with deep Agile experience. Shaun is a certified Scrum Master.


Quality Engineering in a DevTestOps World – A Strategic Enabler with Pradeep Govindasamy

The age of the customer calls for continuous testing. A Forrester study confirms that the unprecedented speed of software delivery has made quality and speed a strategic pursuit for enterprises. While true integration with Dev and Ops brings immediate, visible acceleration, quality engineering through continuous testing ensures the outcomes of DevOps result in market leadership. This phenomenon, called DevTestOps, is an approach that leading global analysts and researchers advocate for delivering impeccable software that propels businesses, as customers look for first-time-ready software.

The concept of DevTestOps can imply either the merging of the Development, Testing, and Operations teams of an organization, or the forming of a middle layer of sorts that contains elements of each team. Whatever the case, DevTestOps is fast emerging as a key to the enterprise cloud. DevTestOps can be viewed as a potential solution for proving the benefits of cloud and improving the efficiency of IT operations, thereby accelerating the release of applications.

DevTestOps is the mixture of practices, philosophies, culture, and tools that enhances an organization’s ability to deliver applications and services at significantly high velocity. This enables enterprises to serve their customers better and obtain a competitive advantage. The services spectrum in DevTestOps streamlines management of the infrastructure, deployment of the app code, and automation of the release process. Software testing services help ensure that the various technology tools and solutions are adeptly adopted by enterprises and function robustly.

In this webinar, Pradeep Govindasamy, CTO & President at Cigniti Technologies, will be presenting the Quality Engineering scenarios across the DevTestOps IT lifecycle that includes BDD, Release Automation, Environment, Data, Virtualization, Build, and Continuous Integration/Continuous Delivery.

Key Takeaways:

  • Learn newer Quality Engineering practices and methodologies for a shift-left testing strategy
  • Explore newer technologies such as service virtualization, early automation, and test data
  • Understand the role of the SDET (Software Development Engineer in Test) in the DevOps IT lifecycle
  • Understand the Quality Engineering tools in the CI/CD (Continuous Integration/Continuous Delivery) platform

About the speaker

Pradeep Govindasamy is the Chief Technology Officer and President at Cigniti Technologies, managing Cigniti North America. In this role, he is responsible for revenue targets, business growth, and technology strategy for Cigniti NA. Pradeep is an industry thought leader in software testing with over 15 years of experience. He has strong expertise in setting up Testing Centers of Excellence around test management, automation, mobility, SOA, and functional testing. Pradeep has won many industry awards and recognitions, including “Top 10 CTOs from Texas” and “Top 100 Change Agents in IT”. He was also instrumental in Cognizant winning the 2013 Informatica Innovation Award for Test Data Management. Prior to Cigniti, he held multiple leadership roles at Cognizant, starting in 2001.

Cybersecurity Technical Risk Indicators: A Measure of Technical Debt with Joe Jarzombek

As cyber threats evolve and as software dependencies grow more complex, understanding and managing software throughout the lifecycle is more critical than ever. The Internet of Things (IoT) is contributing to a massive proliferation of a variety of types of software-reliant, connected devices throughout critical infrastructure sectors. With IoT increasingly dependent upon third-party software, software composition analysis and other forms of testing are needed to determine ‘fitness for use’ and trustworthiness in terms of quality, security, safety, and licensing. Application weakness and vulnerability management should leverage automated means for detecting threat indicators, weaknesses, vulnerabilities, and exploits. Using standards-based automation enables the exchange of information. Leveraging cybersecurity Technical Risk Indicators as a measure of technical debt can assist in software supply chain risk management efforts by providing a means to understand risk exposures attributable to exploitable software.
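
As a rough, hypothetical illustration of the idea (not the indicator definitions in the ITU-T CYBEX 1500-series standards), the Python sketch below rolls scan findings up into a severity-weighted risk exposure figure per component, much the way technical debt is tracked per module; the component names, severities, and weights are invented for the example:

    # Severity weights are an assumption for illustration, not a standard.
    SEVERITY_WEIGHT = {"critical": 10, "high": 5, "medium": 2, "low": 1}

    def risk_exposure(findings):
        """Sum severity-weighted findings per component into one indicator."""
        exposure = {}
        for component, severity in findings:
            exposure[component] = exposure.get(component, 0) + SEVERITY_WEIGHT[severity]
        # Highest exposure first, so testing and remediation can be prioritized.
        return dict(sorted(exposure.items(), key=lambda kv: kv[1], reverse=True))

    # Hypothetical output of a software composition analysis scan:
    scan = [("payment-lib", "critical"), ("payment-lib", "medium"),
            ("logging-lib", "low"), ("auth-lib", "high"), ("auth-lib", "high")]
    print(risk_exposure(scan))  # {'payment-lib': 12, 'auth-lib': 10, 'logging-lib': 1}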

Learning Points:

  • Organizations rely on third-party software that should be tested prior to use or integration within new software.
  • Application weaknesses and vulnerabilities can be detected and mitigated in development and testing prior to use or integration in corporate assets.
  • Technical Risk Indicators, derived from ITU-T CYBEX 1500-series standards, can be used by professionals seeking to improve software quality and security.

About the speaker

Joe Jarzombek is Global Manager for Software Supply Chain Management for the Synopsys Software Integrity Group. At Synopsys, he leads efforts to enhance the Software Integrity Platform to mitigate software supply chain risk via automated analysis and testing technologies. Prior to joining Synopsys, he served as the Director for Software & Supply Chain Assurance in the US Department of Homeland Security Office of Cybersecurity and Communications. He is a retired Lt Colonel in the US Air Force.

Testing in an Internet of Things World! with Bob Crews

This webinar presents Quality Assurance and Software Testing concepts specific to the Internet of Things and its exciting challenges! Technology is evolving quickly and significantly, and our approach to quality MUST change with it! This presentation will focus on the technical aspects of IoT systems and applications as well as the technical considerations specific to the software testing ecosystem. Join this webinar to get a better understanding of how our processes, roles, methodologies, software testing solutions, and strategies must evolve. A detailed test case and script, specific to an IoT application scenario, will be analyzed. Questions to ask while planning our pursuit of quality will become obvious. How will we test? Why will testing “in the wild” be critical? How will we utilize test automation? How must we change now in order to be ready for what lies ahead? This webinar will present unique ideas that will be expanded upon at QUEST 2017:

  • The technical definition of Internet of Things and a “smart” device
  • How Quality Assurance and Quality Control must evolve
  • The strategies of test planning, manual testing, and test automation in an IoT world

About the speaker

Bob Crews, President of Checkpoint Technologies, is a consultant and trainer with over 26 years of IT experience including full life-cycle development involving development, requirements management, and software testing. He has consulted and trained for over 240 different organizations in areas such as addressing mobile testing challenges, effectively using automated testing solutions, test planning, risk analysis, implementing automated frameworks, and developing practices which ensure the maximum return-on-investment with automated solutions. Bob has presented at numerous conferences and user groups throughout the world including QAI, EuroStar (Copenhagen), HP Software Universe, and LatinStar (Mexico City). Bob was named as one of the top five speakers at the QAI Annual Software Testing Conference in 2004.

Technology Megatrends and QA: Ready, Set, Evolve! with Anne Hungate

“What’s dangerous is to not evolve.” Jeff Bezos, CEO of Amazon, offered these words of wisdom – and with his leadership, Amazon is now worth more than Macy’s, Sears, and Target combined. For QA professionals faced with real deadlines from the boss and constant demands at home, knowing where and how to evolve is the tricky part. Without a starting point, we will wait another year for the path to become clear. Join this webinar to get a practical understanding of the major technology trends facing all industries – and leave with a plan to direct your evolution. Learn why and where automation matters and how you can apply your knowledge of your customers’ needs to grow your organization’s brand. Bring this passion to your own personal learning journey and start the year with a plan to invest in yourself. Outcomes:

  • Understand the megatrends executive leadership is facing
  • Learn where QA must evolve to flow with the megatrends
  • Form your own, personal learning plan to drive your career with the megatrends and QA evolution

About the speaker

Anne Hungate discovered her passion for software quality after working in multiple IT roles such as developer, analyst, and program/project manager for more than twenty years in major companies and consulting firms. Experimenting with both engineering practices and organizational design, Anne determined that people and trust are the keys to better software. Most recently, she led the global quality team for a leading financial services organization helping the team prepare for Agile adoption. Anne has presented at local and national conferences, sharing the lessons learned on her software quality journey. Anne holds CSQA and PMP certifications.

Improve your Retrospectives with Agile Kaizen with Angela Dugan

Continuous self-improvement of software teams is traditionally accomplished through retrospectives, a form of post-mortem held at the completion of an iteration. More often than not, retrospectives begin to fade and the list of action items keeps growing until teams simply succumb to business-as-usual practices. In some cases, teams eventually abandon retrospectives altogether because they feel like a waste of time!

  • Do you feel like your retrospectives are a death march where no one is actively participating?
  • Do the same problems seem to resurface repeatedly in the team’s retrospectives?
  • Are your retrospectives ending prematurely or being cancelled in favor of “getting more real work done”?
  • Or maybe you feel great about your agile retrospectives, but just want to learn more about Kaizen…

Join Angela as she explains how you can use Kaizen to analyze and improve your retrospectives, regardless of your team’s process. She will begin with a brief review of what a retrospective is and walk through some examples of both healthy and unhealthy retrospective scenarios she has experienced herself. Angela will then explain the concept of Kaizen, the Kaizen process, and how you can leverage a Kaizen process to turn your retrospectives back into the effective continuous improvement tools they are meant to be!

Learning Objectives

  • Determine whether your current agile retrospectives are effective
  • Learn Kaizen Burst techniques
  • Use Kaizen in agile retrospectives

About the speaker

Angela Dugan is the ALM Practice Manager for Polaris Solutions, a small technology consulting firm based out of Chicago and St. Louis. She has been in software development since 1999, including 5 years as an ALM Tools evangelist with Microsoft. Angela also runs the Chicago Visual Studio ALM user group, is an active organizer and speaker at several local conferences, is a Microsoft ALM MVP, and is both a Certified Scrum master and SAFe Program Consultant. Outside of wrangling TFS, Angela is an avid board gamer, an aspiring runner, and a Twitter addict. She lives in a 1910 house in Oak Park, Illinois that she is constantly working on/cursing at with her husband David.

Metrics That Matter – In the Context of Software Testing and QA with Bernd Haber

Oftentimes, IT Management has little patience or time for detailed test status reports because the metrics are hard to interpret. By the same token, they misunderstand the objectives of software testing. Hence, the perception of testing is that it is merely an effort to improve solution quality, that it is a linear and independent task, and that test results stay valid over time. Control is an important aspect, maybe the most important, of any software project, including during the testing lifecycle. Significant time and effort (money) is invested in preparing testing dashboards with detailed metrics and reports. But many wildly successful projects, like Google Earth or Wikipedia, have proceeded without much control. This session reviews how different kinds of projects have different control needs and changing expectations of what can be controlled. It will present some alternative and less complex approaches to metrics and reporting.

Learning Objectives

  • Expand the understanding of an advanced software test metrics program and framework as related to KPI value levers, such as quality, productivity, maturity, and cost
  • Provide a more nuanced insight into the world of software testing metrics for testing practitioners and test project leads
  • Provide a reference point for test project leads to adjust an existing test metrics approach in order to accommodate senior leadership needs and expectations

About the speaker

Bernd Haber is responsible for Accenture’s North America Testing Service for the Products Industry Operating Group, which includes clients in Retail, Life Science, Consumer Goods, Airline, Automotive, and Hospitality. He is a senior executive member of the Accenture Testing Platform, the firm’s Global Testing Practice and Testing Community of Practice. Bernd specializes in the field of test strategy development, test operation transformation, process performance & quality assurance, as well as test metrics and measurements. He is one of the winners of Accenture’s 2011 Inventor Award program for his patent-pending QA Metrics Dashboard solution. Bernd has been with Accenture for more than 22 years and holds a Master’s degree in Mechanical Engineering and Computer Aided Manufacturing.

Pairwise Testing: What It Is, When to Use and Not to Use with Philip Lew

Many defects occur only when a combination of inputs or events interact with each other. However, if you were to test every combination of inputs, it could take years, especially with today’s complex systems. How do you choose which combinations of inputs to test? Pairwise testing. Pairwise testing is a combinatorial technique for reducing the number of test cases without drastically compromising functional coverage, in order to get more ‘bang for the buck’.

Philip Lew explains the nuts and bolts of pairwise testing, how to incorporate pairwise testing into your test design and planning, and when to use other combinatorial techniques. Learn about some of the tools that can be used as Phil examines when and when not to use pairwise testing, along with some of its advantages and limitations. He’ll also provide a live demonstration of pairwise testing and more.
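
For readers who want to see the mechanics ahead of the demonstration, here is a small, self-contained Python sketch of a greedy all-pairs generator (a naive illustration, not one of the tools Phil covers); it brute-forces candidate combinations, so it only suits small parameter models like the hypothetical one below:

    from itertools import combinations, product

    def pairwise_suite(parameters):
        """Greedily pick full combinations until every pair of parameter
        values appears in at least one test case."""
        names = list(parameters)
        uncovered = set()
        for (i, a), (j, b) in combinations(list(enumerate(names)), 2):
            for va, vb in product(parameters[a], parameters[b]):
                uncovered.add(((i, va), (j, vb)))

        suite = []
        while uncovered:
            best, best_covered = None, set()
            # Choose the candidate covering the most still-uncovered pairs.
            for combo in product(*(parameters[n] for n in names)):
                covered = {((i, combo[i]), (j, combo[j]))
                           for i, j in combinations(range(len(names)), 2)} & uncovered
                if len(covered) > len(best_covered):
                    best, best_covered = combo, covered
            suite.append(dict(zip(names, best)))
            uncovered -= best_covered
        return suite

    params = {"browser": ["Chrome", "Firefox", "Safari"],
              "os": ["Windows", "macOS", "Android"],
              "locale": ["en", "de", "ja"]}
    suite = pairwise_suite(params)
    print(len(suite), "pairwise tests instead of", 3 * 3 * 3, "exhaustive combinations")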

Learning Objectives

  • What pairwise testing is and what its advantages and disadvantages are.
  • When to use pairwise testing and how.
  • How and when to use other combinatorial techniques beyond pairwise testing.

About the speaker

Philip Lew, CEO of XBOSoft, has overseen strategy, operations, and business development since founding the company in 2006. His broad experience spans deep technical expertise as a software engineer, advising on technology and business processes, and founding companies such as Pulse Technologies Inc, a leader in contact center systems integration, until its acquisition by EIS International. In the space of 25 years he has served as an Ernst and Young consultant, led the Systems Integration Services Group at EIS, held executive-level roles in both the USA and Europe, and serves as an Adjunct Professor at Alaska Pacific University. As well as presenting at leading worldwide conferences such as STPCon, PNSQC, Better Software East/West, and StarEast/West, he has had papers published in ACM, IEEE, Project Management Technology, Telecommunications Magazine, Call Center Magazine, TeleProfessional, and DataPro Research Reports.

Compatibility Testing for Mobile Devices with Michael Yudanin

Mobile testing presents a number of challenges that require special attention. One of these is testing your apps and websites for compatibility with different mobile platforms. Mobile devices differ in terms of operating systems and their flavors, processing power, memory, display size, and resolution – all this in addition to the familiar challenge of multiple browsers. Moreover, the mobile market is not only diverse but also extremely dynamic: new OS versions and new hardware show up quite frequently. How do we focus on what can cause issues rather than repeating all our tests on all platforms? How do we implement the principles of compatibility testing for the mobile market, covering the bases and minimizing risk without turning testing into a continuous nightmare and our cubicles into smartphone warehouses? What are the relevant factors that determine on which devices and with which operating systems to test? What are the principal differences between testing mobile websites and native apps as far as compatibility is concerned? What are the criteria for deciding whether to purchase devices or rent a lab?

Learning Objectives

  • The main differences between mobile compatibility testing and compatibility testing for PC and Mac applications and websites.
  • The main factors that determine the selection of tests and platform combination for mobile compatibility testing.
  • Deciding whether to purchase multiple devices, rent a lab, or use a hybrid approach.

About the speaker

Michael Yudanin is the CEO of Conflair, a QA and testing company. He has been working on automating tests for mobile devices since before mobile apps and smartphones became commonplace. Michael developed RealMobile™, a unique approach to using common automation tools to automate testing of mobile apps and websites. Among the large enterprises that have benefited from this approach to mobile testing are Home Depot, Bank of America, The Weather Channel, and Spirit Airlines. Michael is a frequent speaker at testing conferences and regularly delivers classes on test planning, requirements management, test automation, XML, web services testing, and other subjects.

How Agile are you? Creating a High Maturity Agile Implementation with Daniel Tousignant

Understanding Agile maturity is key to having a successful Agile implementation. Like many maturity models, and similar to the CMMI levels, Agile maturity can be measured from Level 1 (Initial, or “Ad Hoc”) to Level 5 (Optimizing, or “Culturally Agile”). Fortunately, the Agile Manifesto helps us create a roadmap to assess where we are on our path to Agility. By reviewing the 4 values and 12 principles, we know what questions to ask in order to assess our maturity. This webinar will help you understand the path to Agile maturity and how to gain access to a free self-assessment tool to gauge your organization’s Agile maturity.
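
As a loose, hypothetical illustration only (not Dan’s self-assessment tool), an assessment of this kind can be reduced to scoring how consistently each manifesto value or principle is practiced and mapping the average onto the five levels; the questions and scores in the Python sketch below are invented for the example:

    # Scores run from 1 (never practiced) to 5 (consistently practiced).
    # Only the endpoint level names come from the webinar description.
    LEVEL_NAMES = {1: 'Initial ("Ad Hoc")', 5: 'Optimizing ("Culturally Agile")'}

    def maturity_level(principle_scores):
        """Average the principle scores and clamp the result to a level from 1 to 5."""
        avg = sum(principle_scores.values()) / len(principle_scores)
        return max(1, min(5, round(avg)))

    # Hypothetical answers for three of the twelve principles:
    scores = {
        "early and continuous delivery of valuable software": 4,
        "welcome changing requirements, even late in development": 3,
        "business people and developers work together daily": 2,
    }
    level = maturity_level(scores)
    print(f"Assessed maturity: Level {level}", LEVEL_NAMES.get(level, ""))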

About the speaker

Daniel Tousignant
Dan is a lifelong project manager and trainer with extensive experience in managing software development projects. Based upon his experience, he has adopted Agile methods for developing and implementing software. He is also passionate about the Agile approach of leadership emerging from self-organizing teams. Dan has over 20 years of experience providing world-class project management for strategic projects, direct P&L experience managing software development project budgets of up to 50 million dollars, and experience managing multi-million dollar outsourced software development efforts. He brings strong, demonstrated, results-driven leadership skills, including the ability to communicate a clear vision, build strong teams, and drive necessary change within organizations. Dan holds a Bachelor of Science in Industrial Engineering from the University of Massachusetts, Amherst, is a Certified Project Management Professional, Professional Scrum Master, PMI Agile Certified Practitioner, and Certified Scrum Professional, and is the owner of Cape Project Management, Inc.

Lean and Enterprise Agile Frameworks with Dr. David Rico

Dr. David F. Rico will give a presentation on “Agile Enterprise Frameworks: For Managing Large Cloud Computing Projects,” which are emerging models for managing high-risk, time-sensitive R&D-oriented new product development (NPD) projects with demanding customers and fast-changing market conditions (at the enterprise, portfolio, and program levels). Dr. Rico will establish the context, provide a definition, and describe the value-system for lean and agile program and project management. He’ll provide a brief survey and comparative analysis of the pros and cons of emerging lean and agile frameworks such as Enterprise Scrum, LeSS, DaD, SAFe, and RAGE. Then he’ll describe the Scaled Agile Academy’s Scaled Agile Framework (SAFe) in greater detail (which is the de facto international standard for scaling the use of agile methods to the enterprise, portfolio, and program levels for both systems and software development). SAFe is a hybrid model best known for “blending” megatrends such as lean and agile principles into a single unified framework, establishing an authoritative foundation for scaling agile methods to large-scale private and public sector programs, and unifying East (lean) and West (agile) into a common language for systems and software development that is both lean “and” agile. In addition to SAFe case studies, late-breaking developments on the use of “Continuous Delivery,” “DevOps,” and bleeding-edge “Unstructured Web Databases” at Google and Amazon to automate large sections of the enterprise value stream will be discussed (which has been successfully used by some of the world’s largest firms to boost organizational productivity by one or two orders of magnitude).

About the speaker

David Rico
Dr. Rico helps oversee a portfolio of large multi-billion dollar IT projects. He has been a technical leader in support of NASA, U.S. Navy, U.S. Air Force, and U.S. Army for over 30 years. He has led over 20 change initiatives based on Cloud Computing, Lean Thinking, Agile Methods, SOA, Web Services, Six Sigma, FOSS, PMBoK, ISO 9001, CMMI, SW-CMM, Baldrige, TQM, DoDAF, DoD 5000, etc. He specializes in IT investment analysis, portfolio valuation, and organization change. He has been an international keynote speaker, presented at leading industry conferences, written seven textbooks, published numerous articles, is a reviewer for multiple journals, and is a frequent PMI, INCOSE, ALN, and SPIN speaker. He is a Certified PMP, CSEP, ACP, CSM, and SAFe Agilist, and teaches at five Washington, DC-area universities. He holds a B.S. in Computer Science, M.S. in Software Engineering, and D.M. in Information Systems. He has been in the IT field since 1983.

Root Cause Analysis – Making Decisions with Jeremy Berriault from Manulife Financial

Decisions are made based on the data available at the time. Some decisions can be made fairly easily and quickly, such as deciding what to eat. On the other hand, many decisions require more thought about the right direction, such as choosing the type of mortgage or investment. Senior leadership makes numerous decisions that affect many groups within an organization; some decisions may seem trivial and some may negatively affect individuals. What we need to understand is how much data is required to assist with these decisions. How much data did they have that caused them to reach that decision?

Root Cause Analysis is one set of data that affects IT projects. Decisions such as resourcing, budgets, and project selection are sometimes based on how much rework was done on previous projects. Providing a count of defects within specific categories, such as code or requirements, only tells part of the story of what is occurring within an organization. It can also create friction between groups, as its subjective and vague nature does not provide a deep understanding of what is truly happening. This webinar will provide insight into the right amount of data needed to assist those decisions.

Learning Objectives

  • Discover how performing root cause analysis can improve your QA group’s value to your organization
  • Learn how to introduce cost/benefit analysis for continuous process improvement efforts
  • Explore how to use root cause analysis to achieve collaboration with stakeholders

About the speaker

Jeremy Berriault
Jeremy Berriault has been in the testing discipline for over 20 years within the Canadian Banking industry. He is currently the Director of the Quality Assurance Center at Manulife Financial, Group Functions Division. He is responsible for QA processes, job families and QA training curriculum across the enterprise. Jeremy holds an MBA, completing a research project on the attitudes and views business and testing groups have for each other as part of his master’s program. His research provided reasoning as to why each group would think the way they do and solutions to help resolve issues. Jeremy’s main drive is to help bring attention to the value testing groups can bring to an organization, not just providing assurance of quality software, but also financial and efficiency benefits across the organization.

Time to Cut the Cord with Dan McFall

Quality Assurance organizations face tough challenges managing the mobile devices needed to get the job done every day. Existing tools can force you to tie precious devices to a single machine and then pass them around physically when it’s time to share or collaborate. This undermines efficient resource utilization and can impact test coverage. We’ll discuss what makes mobile so different from other types of testing from an environmental and process standpoint, and general strategies for recapturing that lost efficiency.

Learning Objectives

  • USB-tethering of devices for use by developers, QA, and support professionals diminishes team agility.
  • Without devices under management as part of a highly available infrastructure, DevOps efficiencies can evaporate in mobility.
  • Considerations for types of mobile testing and the wisest places to spend your time.

About the speaker

Dan McFall
Dan McFall is a seasoned, knowledgeable software professional with extensive experience spanning mobility, manual and automated mobile testing, secure test device management, private cloud, Agile software development and technical support. Currently, Dan is the vice president of mobility solutions at Mobile Labs, a leading provider of mobile testing and secure test device management solutions. In his role, Dan works with global organizations to improve their development and QA processes around mobile device and application testing. Dan is a graduate of the Georgia Institute of Technology with a degree in Industrial and Systems Engineering.

Metrics: The Force Awakens with Joseph Ours

It is often said, “You cannot improve what you cannot measure.” That statement has led to a proliferation of measurement and metrics-gathering programs throughout history. In software testing, metrics are used frequently to inform stakeholders about the quality and/or progress of testing in a project. Many times, metrics are presented in visual form in order to tell a compelling story – often to influence decision making. That makes metrics a life force in the universe of quality assurance. In this presentation, we will discuss some common quality assurance and testing metrics and demonstrate how the force can be manipulated for good and evil. Those with ill intent will learn how to manipulate the metrics for their own purposes. Those pure of heart will learn how to see past the visuals and defend against the dark arts.

Learning Objectives

  • The purpose of metrics
  • How to display metrics to tell your story
  • How to spot when someone is being told an inaccurate story with metrics

About the speaker

Joseph Ours
Joseph Ours draws on 15 years of experience providing executive-level leadership while managing high-profile initiatives, with a demonstrated ability to lead people toward successful delivery. Throughout his diverse career, he has built a solid reputation as a thought leader who exhibits a results-driven business approach and an exceptional ability to achieve success. He is a strong leader in business processes with a proven history of providing project and portfolio management for large technology initiatives. Joseph brings both a strategic and a tactical thought process to solving IT-related issues. He holds bachelor’s degrees in electronic engineering technology and technical management in addition to a Master of Business Administration.

Enterprise Agility Starts with Healthy Teams, How Healthy is YOUR Agile Team? with Sally Elatta

Everyone wants metrics, but which ones really matter? Which metrics can help you ‘actually’ get better and give you visibility into the health of your teams? Take a deeper dive with our dynamic Agile Expert, Sally Elatta, as she walks you through the top 5 metrics you need to be looking at and how you can create a continuous growth process where teams, programs, and portfolios are getting better quarter after quarter. All attendees will have access to download the powerful TeamHealth radar and try it with their own teams!
Learning Objectives

  • How do you really measure TeamHealth and what metrics should you look for?
  • How to create a continuous growth process that is predictable and measurable.
  • Appreciate the powerful TeamHealth radar.

About the speaker

Sally Elatta
Sally is a dynamic consultant, trainer, and coach who is passionate about transforming people, teams, and organizations. Her unique mix of technical, business, leadership, and soft skills helps her transform individuals at all levels. Sally has developed a unique set of results-driven training workshops that use real-world best practices.

Why Test Automation Fails with Jim Trentadue

The automation challenges testers face often lead to subsequent failures. Learn how to respond to these common challenges by developing a solid business case for increased automation adoption: engaging manual testers in the testing organization, being technology agnostic, and stabilizing test scripts regardless of application changes. Learn Jim Trentadue’s explanations of a variety of automation perceptions and myths:

  • The perception of significantly increased time and people to implement automation.
  • The myth that once automation is achieved, testers will not be needed.
  • The myth that automation scripts will serve all the testing needs for an application.
  • The perception that developers and testers can add automation to a project without additional time, resources or training.
  • The belief that anyone can implement automation.

About the speaker

Jim Trentadue
Jim Trentadue has over 15 years of experience as a coordinator/manager in the software testing field. He has filled various roles in testing over his career, focusing on test execution, automation, management, environment management, standards deployment, and test tool implementation. In the area of offshore testing, Jim has worked with multiple large firms on developing and coordinating cohesive relationships. Jim has presented at numerous industry conferences including the Rational Development Conference, IIST, and QAI chapter meetings. Jim has acted as a substitute teacher at the University of South Florida’s software testing class, mentoring students on the testing industry and trends for establishing future job searches and continued training.

Enterprise Agile Failure Modes and Solutions with Hillel Glazer

Agile adoption holds so much promise it sounds too good to be true. Often transformation efforts to adopt agile get off to impressive starts. Pilot projects typically succeed and enthusiasm is high. However, when moving towards broader adoption and attempting to institutionalize sustained agile practices, the successes of the pilot efforts fade into the past and organizations find themselves frustrated with poor traction and increased headaches. These experiences are often accompanied by a slide in process maturity from what were prior accomplishments with establishing standard process assets and tools. These challenges not only appear among companies transitioning to agile, they even appear among companies who have never used anything but agile. So if agile is so great, holds so much promise and seems to succeed with so many teams, what causes these problems? Shouldn’t agile have solved them? The webinar will begin by introducing a framework for understanding different types of companies and how these differences are critical to successful transition to (or use of) agile methods at scale. Companies have differing delivery constraints and business drivers that are likely working against even the best of agile transformation. Next we will explore a strategy for establishing an end state vision and an operational model to guide transformation. Finally, we’ll define an approach for incrementally introducing change, measuring outcomes, and sustaining the change once things really get going and keep them going at scale.

About the speaker

Hillel Glazer
Hillel Glazer is the founder, Principal, CEO, and “all-around Performance Jedi” of Entinex, Inc., a Baltimore, MD-based management consulting firm made up of aerospace (and other) engineers. Hillel is an internationally recognized authority on bringing lean and agile values and principles into the regulated world, and he was selected as a Fellow of the Lean Systems Society in its inaugural fellows induction. Hillel and his company have close ties to the CMMI Institute at Carnegie Mellon University in Pittsburgh, where he serves as an advisor on ways to bring together CMMI (Capability Maturity Model) process improvement models that create high-performance, high-maturity cultures, with lean and agile practices. He is the author of the 2011 book, “High Performance Operations,” and has written widely on the subject of high performance systems, models and organizations including the world’s first peer-reviewed, professionally edited article on CMM and Agile in 2001 as well as the SEI’s first official Technical Report on Agile and CMMI in 2008.

You Want to Use SCRUM, You Are Told to Use CMMI – How They Can Work Together Elegantly with Neil Potter

If you are a software engineer or IT professional, your group has very likely shown a strong interest in reducing costs and improving quality and productivity. Your group might also have looked at various pre-packaged frameworks, such as Agile (e.g., Scrum and Extreme Programming), CMMI, and Six Sigma. At first glance, these frameworks might look at odds with one another, making it difficult to use two or more. This perception typically arises because much of the information shared regarding these frameworks comes from unresearched opinions and failure stories, rather than from understanding the specifics of each framework. Each framework can be implemented successfully depending on how much care is placed on its implementation. In this session, CMMI and Scrum are compared, since they are two of the most commonly used frameworks and groups frequently struggle with using them together.

About the speaker

Neil Potter
Neil has been working in the software application and IT fields since 1985, helping companies improve their performance. He has 28 years of experience in software and process engineering. Neil Potter is co-founder of The Process Group, a company formed in 1990 that consults on process improvement, CMMI, Scrum, software engineering, and project management. Neil is a CMMI-Institute certified lead appraiser for SCAMPI appraisals, an Intro to CMMI instructor (development and services), a Six Sigma Greenbelt, and a Certified Scrum Master. He delivers hands-on workshops on Scrum, CMMI, requirements, planning, estimation, inspection, supplier management, facilitation, and organizational change. He has a B.Sc. in Computer Science from the University of Essex (UK) and is the co-author of Making Process Improvement Work – A Concise Action Guide for Software Managers and Practitioners, Addison-Wesley (2002), and Making Process Improvement Work for Service Organizations, Addison-Wesley (2012).

Agile Resiliency: How CMMI Will Make Agile Thrive and Survive with Jeff Dalton

Large corporations and the Federal government are increasingly directing software developers to “be agile,” but business practices related to marketing, procurement, project management, and systems definition are anything but. While more developers are living in an agile world, the business continues to live in waterfall surroundings. It’s not a conflict that is easily resolved, but there is an opportunity to take control of the debate. Why not embrace both?

About the speaker

Jeff Dalton
Jeff Dalton is Broadsword’s President, Certified Lead Appraiser, CMMI Instructor, ScrumMaster and author of “agileCMMI,” Broadsword’s leading methodology for incremental and iterative process improvement. He is Chairman of the CMMI Institute’s Partner Advisory Board and President of the Great Lakes Software Process Improvement Network (GL-SPIN).

Automated Software Testing: Practices that Yield Positive Results with Elfriede Dustin

This webinar describes automated software testing practices that have yielded the positive results required of an automated test program. We will provide proven examples of best practices in a scriptless automated testing environment using image-based capture. It is important not only that a capable automated software testing solution is used to meet specific automated testing requirements, but also that the appropriate capture techniques are applied. Too often, excessive time is spent on automated test maintenance; this webinar will provide workaround suggestions and ideas for avoiding the maintenance difficulties that can lead to shelved automated testing solutions. You will also gain insight into the ideal requirements an automated testing solution should meet in order to implement the recommended best practices.
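
As one concrete illustration of the image-based approach (as opposed to object- or DOM-based locators), the minimal Python sketch below uses the pyautogui library to find a stored screenshot of a control on the live screen and click it. The webinar’s specific tooling is not named here, and the file paths are hypothetical.

    import pyautogui

    def click_image(image_path: str) -> None:
        # Locate the center of the stored control image on the current screen.
        # confidence= requires the optional opencv-python package; omit it for exact matching.
        location = pyautogui.locateCenterOnScreen(image_path, confidence=0.9)
        if location is None:
            raise RuntimeError(f"Control not found on screen: {image_path}")
        pyautogui.click(location)

    # Keeping reference images in one shared folder, named by control rather than by test,
    # is one way to limit the maintenance burden when the UI changes.
    click_image("images/submit_button.png")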

About the speaker

Elfriede Dustin, IDT
Elfriede Dustin has over 20 years of IT experience implementing effective testing strategies on both government and commercial programs. She is the author or co-author of six books related to software testing, including Effective Software Testing, Automated Software Testing, Quality Web Systems, and The Art of Software Security Testing. Elfriede has implemented automated testing methodologies as an internal SQA consultant at Symantec, worked as an Assistant Director for Integrated Testing on the IRS modernization efforts, built test teams as QA Director for BNA Software, and was the QA Manager for the Coast Guard MOISE program. Her goal is to continue to help advance automated software testing.

Career Planning for Agile QA with Bill Rinko-Gay

Bill Rinko-Gay will present Career Planning for Agile QA. Drawing on his experience with Agile transformations and his many years in Quality Assurance, Bill will discuss what QA professionals can do to ensure they have the right skills for employment as the computing industry transforms itself to Agile methods. Join this webinar to gain insight into how QA is different when software is developed with Agile methods and how to adapt QA to popular Agile management and development tools. You will learn the key characteristics of the Agile QA professional and which skills should be acquired or enhanced. Attendees will be able to adjust their training and career plans to take the move to Agile into account.

About the speaker

Bill Rinko-Gay, Agile Integrity, LLC
Bill Rinko-Gay is the founder and a Contributing Member of Agile Integrity, LLC. Currently, Bill is working as a Transformation Agent and ScrumMaster for Macmillan Higher Education. Bill has been involved in software test and quality assurance since 1982, when he began testing command and control software for orbiting satellites for the US Air Force. Since leaving the DoD, Bill has worked on projects in defense, computer manufacturing, publishing, network security, financial services, and state and local government. Beginning his fourth decade in the field, Bill is still improving techniques that allow teams to produce excellence. His most recent work is in Scrum and agile quality assurance, improving software quality in the 21st century. A regular speaker and trainer for QAI affiliate organizations, Bill currently holds PMP and Certified ScrumMaster certifications.

Software Quality Metrics Do’s and Don’ts with Philip Lew

Don’t just measure and track progress and then deliver reports that no one reads. This webinar discusses some of the most common mistakes in using metrics. The primary takeaway is to learn from the mistakes of others, particularly where to use and not use metrics to measure your testing and QA efforts; the last thing you want is to measure the wrong thing and create unwanted behavior. With that knowledge of what not to do, we’ll then dive into how to develop a measurement and metrics framework that aligns with the organization’s business objectives. This means taking a manager’s viewpoint, so that your metrics don’t just measure testing progress but also measure product quality and how it impacts the organization’s bottom line. As part of the webinar, we’ll discuss a variety of metrics that can be used to correlate work effort with results and enable you to plan and forecast your testing needs.
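
To make the idea of outcome-oriented metrics concrete, here is a minimal Python sketch. The formulas are standard (defect removal efficiency and a rough cost-of-escapes figure), but every field name and number is hypothetical rather than taken from the webinar.

    # Minimal sketch: report metrics tied to product quality and cost, not just test activity.

    def defect_removal_efficiency(found_before_release: int, found_after_release: int) -> float:
        """Share of all known defects caught before the product shipped."""
        total = found_before_release + found_after_release
        return found_before_release / total if total else 1.0

    def cost_of_escapes(found_after_release: int, avg_cost_per_escape: float) -> float:
        """Rough bottom-line impact of escaped defects (support, hotfixes, rework)."""
        return found_after_release * avg_cost_per_escape

    if __name__ == "__main__":
        dre = defect_removal_efficiency(found_before_release=240, found_after_release=18)
        impact = cost_of_escapes(found_after_release=18, avg_cost_per_escape=1500.0)
        print(f"Defect removal efficiency: {dre:.1%}")               # 93.0% with these inputs
        print(f"Estimated cost of escaped defects: ${impact:,.0f}")  # $27,000 with these inputs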

About the speaker

Philip Lew, XBOSoft
With extensive experience in a variety of management and technical positions in software development and product management, Philip Lew today leads XBOSoft’s (www.xbosoft.com) direction and strategy as its CEO. His Ph.D. research in software quality and usability resulted in several IEEE and ACM journal publications, and he has also been published in a number of trade journals. He has presented at numerous trade and academic conferences, and over the past 20 years he has worked with hundreds of organizations to assess the quality of their software, examine software quality processes, and set forth measurement plans to improve software quality using systematic methods. He received his Bachelor of Science and Master of Engineering degrees in Operations Research from Cornell University. His post-doctorate research focuses on software quality and usability measurement.

Avoiding Common Performance Test Planning Pitfalls with Vic Soder

In large-scale application implementation and integration projects, there are often well-intentioned performance test plans established to demonstrate readiness for a production go-live. Unfortunately, numerous projects have found that even with a formal performance test plan, a strong set of performance testing tools, and a team of well-trained performance test execution staff, the results of a formal performance test and analysis program often fall short of expectations. In this webinar, Vic will discuss a variety of challenges that must be planned for – and overcome – to ensure that a performance testing program is successful. These include securing a representative, dedicated performance testing environment; a fully functioning application with production-like test data; and the identification and characterization of workloads and their impact. Learn how to overcome these and other challenges in performance testing.

About the speaker

Vic Soder, Deloitte Consulting, LLP
Vic Soder is a Director with Deloitte Consulting LLP’s Systems Integration service line, with over 20 years of experience in Deloitte’s Technology practices. He leads Deloitte’s national Application Performance Center of Expertise, specializing in application performance, system sizing, and capacity planning advisory services. Vic has provided performance and capacity planning services to a wide variety of large clients in multiple industries. He previously worked in positions with Boole & Babbage and the Institute for Software Engineering, and holds a B.S. degree in Mathematical Sciences from Stanford University.

A Day in the Life of a Test Manager – A Case Study & Research Report with Mike Lyles

Test Managers: the world is full of them. But do we really know what they do? Does your organization agree on the core roles and responsibilities of the team, and on the challenges that test organizations face? The answer is probably “No.” If so, then this webinar is for you! Join Mike Lyles to review the research he conducted on test management as he examines:

  • Notes from interviews with experts and leaders in testing on the following topics:
    • The Test Manager’s Role
    • Most Significant Challenges
    • People Management
    • Conflict Resolution
  • Summary of discussions from blogs, LinkedIn groups, and Twitter
  • Results of a survey taken by 275+ test professionals on test management and leadership
  • Test Manager responsibilities: textbook vs. real-life
  • Areas where testing organizations are not always aligned
  • Soft Skills and mentoring skills that every test leader should possess to be effective
  • Suggestions to maintain relevance in a constantly evolving testing world

About the speaker

Mike Lyles
Mike Lyles is a Sr. QA Manager with Lowe’s Companies, Inc. Mike has over 20 years of IT experience, working in various roles over the years – from technical support to programmer, to Program Management Office, to Solutions Development Manager, to Testing/QA. His current role includes test management responsibilities for a major company domain covering Store Systems, Supply Chain, Merchandising, Marketing, and Middleware.

Acceptance Test Driven Development with Dr. Timothy Korson

As a testing practitioner you have your head down testing company projects, but as a testing professional you need to look up once in a while and consider trends in the testing community. This webinar will consider a trend that, if it hasn’t already, is likely to impact you in the near future: acceptance test-driven development (ATDD). ATDD originated in the agile community but has broad applications even for traditional development organizations with separate test and QA departments. This webinar will teach you how to use the concepts of ATDD to increase your value to your organization no matter what testing methodology it uses.

Key learning points

Webinar attendees will learn how ATDD can help:

  • Dramatically increase the efficiency of test organizations
  • Increase product quality
  • Make products more testable
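
To make ATDD concrete, here is a minimal, self-contained pytest sketch. The “discount” feature and every name in it are hypothetical; the point is only that the acceptance tests express business-facing criteria agreed with the product owner and are written before the code they exercise.

    import pytest

    # In ATDD the tests below come first; this implementation is then written to make them pass.
    def apply_discount(order_total: float, loyalty_member: bool) -> float:
        """Loyalty members get 10% off orders over $100 (hypothetical business rule)."""
        if loyalty_member and order_total > 100:
            return round(order_total * 0.90, 2)
        return order_total

    def test_loyal_customer_gets_ten_percent_off_orders_over_100():
        # Given a $120 order placed by a loyalty-program member,
        # when discount rules are applied, then the customer pays $108.
        assert apply_discount(order_total=120.00, loyalty_member=True) == pytest.approx(108.00)

    def test_non_member_pays_full_price():
        assert apply_discount(order_total=120.00, loyalty_member=False) == pytest.approx(120.00)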

About the speaker

Tim Korson
Dr. Timothy Korson is a Scrum trainer and coach. He has more than a decade of significant experience working on a large variety of systems developed using modern software engineering techniques, including distributed, real-time, embedded systems as well as business information systems in n-tier, client-server environments. He helped design, and has taught in, a Master of Software Engineering program and has been an international industry consultant since 1985. Tim has authored numerous articles and co-authored a book on the subject, Object Technology Centers of Excellence. He has given frequent invited lectures at major international conferences and has contributed to the discipline through original research.

Addressing Mobile App Testing Challenges with Lee Barnes

If the mobile technology train hasn’t arrived at your organization yet, it soon will. Are you ready to jump onboard and face the unique testing challenges presented by mobile applications? In this session, Lee will lead a journey to help you understand where mobile quality is, where it’s going, why it matters to you, and what you can do to help ensure mobile quality in your organization. Lee’s presentation will highlight testing challenges specific to mobile apps and present mobile testing best practices. You will understand why testing in a mobile environment is different from traditional software testing and how those differences can be addressed. Attend this talk and walk away with a solid mobile testing baseline and best practices for addressing the challenges that lie ahead.

Key learning points

Webinar attendees will learn:

  • The unique challenges presented by testing mobile apps
  • Ways to adapt existing practices to address mobile testing challenges
  • An approach for selecting the appropriate mobile devices to test

About the speaker

Lee Barnes
Lee Barnes has over 18 years of experience in the software quality assurance and testing field. He has successfully implemented test automation and performance testing solutions in hundreds of environments across a wide array of industries. He is a recognized thought leader in his field and speaks regularly on related topics. As Founder and CTO of Utopia Solutions, Lee is responsible for the firm’s delivery of software quality solutions which include mobile quality, performance management, and test automation.

How Do Roles Transition In an Agile Environment with Tom Cagley

Transitioning to or incorporating agile into a traditional environment requires sorting out who does what and how they do it. Most organizations have roles that have evolved over many years – roles that are tied to status and remuneration. Transitioning to agile changes how many of these roles are defined. These types of changes need to be part of planning your organizational transformation, but first you have to understand the new world order. What happens to the role of a test manager or tester when your team structure requires checking your title at the door? This webinar will explore the agile “gravity well,” which requires blending and synthesizing roles.

  • Agile requires re-mixing roles
  • How to translate traditional roles to agile roles
  • How role changes affect oversight

About the speaker

Tom Cagley
Tom Cagley leads DCG’s Software Process Improvement and Software Measurement Consulting Practices. He has over 20 years of experience in the software industry. Tom has held technical and managerial positions in different industries as a leader in software methods and metrics, quality assurance, and systems analysis. He is a frequent speaker at metrics, quality, and project management conferences. His areas of expertise encompass methods and metrics, quality integration, quality assurance, and the application of the SEI’s CMMI® to achieve process improvements. Tom is the current President of the International Function Point Users Group. He also is an active blogger and podcaster, hosting and editing the Software Process and Measurement Cast.

Lean Development: Achieve Structural Quality & Reduce Rework with Chris Manuel

Lean development techniques continue to garner a lot of attention, but in practice their acceptance has been slow and the results have been mixed. As a development or test manager, there is a set of lean principles you can leverage to help eliminate the largest source of waste in development – defects and the rework effort that results. Chris Manuel will discuss a framework for applying lean concepts to improve structural quality and reduce defects.

About the speaker

Chris Manuel
Chris Manuel, Worldwide Solutions Director at CAST, has 18 years of IT services experience including full life-cycle development involving project management, development, and testing. He has led consulting and large program deliveries for many Fortune 500 organizations across a variety of industries including Retail, CPG, Manufacturing and High Tech. With expertise in global delivery models, large test program management, testing centers of excellence and software analysis & measurement, he has held leadership positions with global system integrators including Wipro, Fujitsu and CGI (AMS). Chris is PMP, ITIL Practitioner, and ITIL Foundation certified.

Mobile Testing with Michael Yudanin

Mobile testing is no longer a question – it’s a reality. The question is how to approach it. This is precisely the topic of the one-day class offered at TesTrek 2013, Mobile Testing. This free webinar will give a taste of that class by reviewing the subjects it covers and focusing in more depth on two of them: outlining a mobile testing strategy and mobile test automation.
The right strategy is the key to success. As such, it should be both robust enough to cover existing needs and flexible enough to accommodate emerging challenges. We will briefly outline the main challenges facing mobile testing and then outline the components of a mobile automation strategy that should answer those challenges.
Mobile test automation is also a must. The pace of change, the multiplicity of platforms, and the criticality of today’s mobile applications – financial, retail, medical, and more – call for increased effort on this front. Just like the overall mobile testing strategy, mobile test automation needs to be both robust and flexible. This, as we will discover, requires a massive rethinking of the way we do automation, and perhaps even abandoning some good old approaches for the sake of newer, more flexible ones. This part of the webinar will include not only discussion but also a brief demo of a possible solution.
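
One common way to keep mobile automation robust and flexible is to keep platform-specific details out of the test flow itself. The Python sketch below is a minimal illustration of that idea; the driver object stands in for whatever automation client is in use (for example, an Appium session), and every identifier here is hypothetical rather than taken from the webinar.

    # Per-platform locators live in one table; tests talk only to the screen object.
    LOCATORS = {
        "android": {"username": "com.example:id/user", "password": "com.example:id/pass", "login": "com.example:id/login"},
        "ios":     {"username": "userField",           "password": "passField",           "login": "loginButton"},
    }

    class LoginScreen:
        def __init__(self, driver, platform: str):
            self.driver = driver                  # hypothetical wrapper around the automation client
            self.locators = LOCATORS[platform]

        def log_in(self, user: str, password: str) -> None:
            # A UI change on one platform only touches the locator table, not every test.
            self.driver.type(self.locators["username"], user)
            self.driver.type(self.locators["password"], password)
            self.driver.tap(self.locators["login"])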

About the speaker

Michael Yudanin
Michael Yudanin is CEO of Conflair, a QA and testing company. He has been working on automating tests for mobile devices since before mobile apps and smartphones became commonplace. Michael developed RealMobile™, a unique patent-pending approach to using common automation tools to automate mobile app testing. Among the clients who have benefited from this approach are Home Depot, Bank of America, the Weather Channel, and several other large enterprises. Michael is a regular conference speaker and instructor.

Test Planning Versus Test Strategy: Are They the Same? with Clyneice Chaney

We’ve all heard about strategic planning: it’s the preparation of battle plans or the path to achieving goals. People talk about strategies when they want to change something or achieve something. So when we talk about test planning and test strategies, are we talking about the same thing? If we have a plan, do we need a strategy? If we have a strategy, do we need a test plan? In today’s market, with the need for leaner, quicker, and more effective testing, what options should we consider with regard to test strategy and test plan documentation? This session discusses test planning and test strategy development and suggests approaches for today’s testers and test managers.

Key learning points

Webinar attendees will learn:

  • Test planning definition and to-dos
  • Test strategy definition and to-dos
  • How to use test strategies as a part of today’s testing
  • Feasible formats for test plan/strategy documentation

About the speaker

Clyneice Chaney
Clyneice Chaney brings over 20 years of testing, quality assurance, and process improvement experience. Clyneice holds certifications from the American Society for Quality as a Certified Quality Manager, QAI Global Institute’s Certified Quality Analyst, and Project Management Institute’s Professional Project Manager. She has participated as an examiner for state quality awards for Georgia and Virginia. She is currently an instructor for the International Institute for Software Testing and has presented technical papers at the Software Engineering Institute: SEPG Conference, American Society for Quality: Quality Manager’s conference, Quality Assurance Institute International Testing Conference, International Conference on Software Process Improvement and Software Test and Performance Testing Conferences.

Project Management Is Dead, Long Live Project Management with Tom Cagley

Software project management is defined as the activities required to plan and lead software projects. Historically, IT projects have identified a single person to play this role. Many forms of Agile have eschewed the project manager role and instead distribute the activities associated with project management across the core team: the product owner, the development team, and the Scrum Master. In Agile, project management is dead… at least as a single role that leads, directs, controls, and administers a project team, because those responsibilities are distributed to the team. This webinar provides an overview of how these three competing constituencies can work together to control and shape project execution.

About the speaker

Tom Cagley
Tom Cagley leads DCG’s Software Process Improvement and Software Measurement Consulting Practices. He has over 20 years of experience in the software industry. Tom has held technical and managerial positions in different industries as a leader in software methods and metrics, quality assurance, and systems analysis. He is a frequent speaker at metrics, quality, and project management conferences. His areas of expertise encompass methods and metrics, quality integration, quality assurance, and the application of the SEI’s CMMI® to achieve process improvements. Tom is the current President of the International Function Point Users Group. He also is an active blogger and podcaster, hosting and editing the Software Process and Measurement Cast.

Top Ten Challenges of Test Automation with Bob Crews

Your organization has implemented software test automation and is not realizing the value it expected. Or perhaps it’s just starting to consider implementation and you wish to be proactive (always a wise decision). The scope and complexity of software testing are increasing as new technologies and environments emerge, applications become more advanced, and users become more astute! Including test automation as part of your strategy can improve your testing process, but you’re then faced with numerous considerations. What challenges should we expect, and how do we overcome them? If left unaddressed, what impact will these challenges have? Does our software testing team have the necessary skills to be successful? How will we define and measure success? This presentation will address common challenges and provide strategies for overcoming them in order to develop a strategic roadmap to successful implementation!

Key learning points

Webinar attendees will learn:

  • Ten of the most common real-world challenges encountered by organizations, based on the author’s 16 years of experience implementing test automation. The impact of these challenges, along with suggested solutions, will also be addressed.
  • The recommended criteria for performing a Test Automation Assessment to better allow your organization to create an implementation strategy.
  • Ideas for developing a skill assessment strategy in order to assemble and manage an effective test automation team.

About the speaker

Bob Crews, President of Checkpoint Technologies, is a consultant and trainer with over 26 years of IT experience, including full life-cycle work involving development, requirements management, and software testing. He has consulted and trained for over 240 different organizations in areas such as addressing mobile testing challenges, effectively using automated testing solutions, test planning, risk analysis, implementing automated frameworks, and developing practices that ensure the maximum return on investment from automated solutions. Bob has presented at numerous conferences and user groups throughout the world, including QAI, EuroSTAR (Copenhagen), HP Software Universe, and LatinStar (Mexico City). Bob was named one of the top five speakers at the QAI Annual Software Testing Conference in 2004.

Top Ten Attitudes to Abolish When Adopting Agile with Gil Broza

Many of those new to Agile don’t realize how unfamiliar, and sometimes even uncomfortable, Agile methods are to practice at first. And so, during the Agile adoption journey, they hold on to familiar attitudes that made sense before. Some of these attitudes are fine in an Agile context, but others – often going unstated – lead to mediocre performance or disaster.
Gil Broza, known for his people-centric, pragmatic approach to Agile transformations, shares in this talk the top 10 attitudes to shake loose – and what to replace them with. These attitudes include managerial ones, such as “assign work to task experts” and team ones, such as “The work is done when everyone has completed their own piece.” Other examples are unhelpful resignation to reality (“Things take long, and that’s just the way it is”) and narrow perspectives (“If it runs, it’s ready”).

Key learning points

  • Which hold-over attitudes you have (some of which you probably never noticed or questioned)
  • Their impact on performance
  • Powerful techniques and mindset you can embrace instead (they might not seem easy, but they are within your reach)

About the speaker

Gil Broza
Gil Broza helps software organizations in their adoption and use of agile techniques. He guides them in implementing a reliable, sustainable methodology so they truly delight their customers and make a positive impact. He works at all organizational levels, coaching people in technical, managerial, and leadership behaviors. Gil’s forthcoming book, The Human Side of Agile: How to Help Your Team Deliver, will guide agile team leaders in taking their teams to outstanding performance. Gil has been a regular contributor and coaching stage producer for the agile series of conferences, a sought-after speaker for other industry events and groups, and host of numerous public webinars about agility.

The Essential Product Owner: Partnering with the Team with Bob Galen

The Product Owner (PO) role is arguably the most crucial role within agile teams. Unfortunately, we often hear horror stories about POs who go it alone – who aren’t available to their teams, who change their minds incessantly on business priorities, and who ignore quality requirements and technical debt. Even the best POs struggle to meet the many demands of the business while still providing sufficient guidance to the team.

Bob Galen shares real-world stories in which he has seen “effectively partnered” teams and Product Owners truly deliver balanced value for their business stakeholders. In this webinar he’ll show how story mapping and release planning can set the stage for effective team workflow, establishing a “Big Picture” for everyone to shoot for; how shared goals at both the iteration and release levels cement the partnership between team and Product Owner; and how a tempo of regular, focused backlog grooming sessions gives the team and Product Owner a mechanism for exploring well-nuanced, high-value backlogs.

If you are on a team, or are a PO, struggling to deliver results effectively, you’ll leave with ideas for establishing an ecosystem in which the Product Owner and the team drive continuously improving performance.

About the speaker

Bob Galen
Bob Galen is the Director of the Agile Practice at Zenergy. Bob is an in-demand agile adoption coach, trainer, and consultant with over 10 years of agile experience across Software, QA/Test, and Project Management. Bob’s specialty is Agile at-Scale challenges.

The Future of Requirements Definition Management with Bryan Fangman

Delivering quality software begins with clear, agreed-upon requirements. Industry averages for project rework range from 20–40%, and of that rework 70–85% is related to requirements – which puts requirements-related rework at roughly 14–34% of an entire project. Working from requirements that are “well formed” early in the project lifecycle will significantly reduce the frequency of defects and clearly identify testing scope. Visualizations (such as storyboards and interactive simulations) ensure requirements are understood and provide a model for the creation (generation) of test cases. If your organization has embraced Agile development methodologies, you must be prepared to find the balance between large sets of formal requirements and less formal Agile user stories. In a fast-paced environment, understanding, tracking, and assessing the impact of changing business needs has never been more important. This presentation will explore some of the challenges affecting requirements definition and offer techniques to meet these challenges with success!

Key learning points

Webinar attendees will learn:

  • How to use a visual approach to requirements definition for better requirements and aligned testing
  • Impacts of Agile on traditional requirements management and testing approaches – the requirements “bell curve”
  • Trends in requirements definition and management

About the speaker

Bryan Fangman
Bryan Fangman is a senior product manager at Borland, responsible for Requirements Definition and Management (RDM) strategic planning, software development, product support, marketing, and customer relations, with over 15 years of experience developing and integrating enterprise software applications. Formerly a systems engineering staff senior with Lockheed Martin, he focused on the alignment and integration of standard engineering processes with application lifecycle management tools supporting the CMMI Level 5 standard. He helped establish the Center of Excellence for Application Development and supported all phases of the application development life cycle, including planning, requirements definition, analysis and design, development, testing, deployment, and support. He co-presented with Forrester Research on the topic of object-based requirements, has presented to IIBA chapters on requirements visualization, and currently speaks at executive luncheons on balancing requirements management and delivery in organizations that use both Waterfall and Agile methodologies.

Demystifying SAFe 4.0 | What does it mean for Large Scale Transformation? with Dr. David Rico

The Scaled Agile Framework (SAFe) is a commercial industry body of knowledge for systems and software engineering, based on lean and agile values, principles, and practices. Lean and agile methods are now used by over 95% of public and private sector organizations worldwide. SAFe is a multi-level model consisting of best practices, guidelines, and tools for enterprise-wide portfolio management, value stream management (mission workflows, threads, and complex acquisitions), program management, and team-level project management. SAFe provides the program management and systems engineering discipline necessary to build complex, enterprise-wide, mission- and safety-critical systems, while retaining the flexibility, adaptability, and market-responsiveness of lean and agile principles. SAFe is emerging as the de facto standard for Global 500 firms, top U.S. financial institutions, major U.S. defense contractors, and public sector agencies such as the U.S. Department of Defense.

  • Understand and explain how SAFe 4.0 supports Lean & Agile principles.
  • Illustrate how and why SAFe 4.0 supports systems and software engineering.
  • Master the roles, responsibilities, and guidelines of its four major levels.
  • Learn how to apply SAFe 4.0 for portfolio, program, and project management.
  • Identify the business case, justification, and rationale for SAFe 4.0.
  • Internalize SAFe 4.0 principles by engaging in conversation and dialogue.

About the speaker

David Rico
Dr. Rico helps oversee a portfolio of large multi-billion dollar IT projects. He has been a technical leader in support of NASA, U.S. Navy, U.S. Air Force, and U.S. Army for over 30 years. He has led over 20 change initiatives based on Cloud Computing, Lean Thinking, Agile Methods, SOA, Web Services, Six Sigma, FOSS, PMBoK, ISO 9001, CMMI, SW-CMM, Baldrige, TQM, DoDAF, DoD 5000, etc. He specializes in IT investment analysis, portfolio valuation, and organization change. He has been an international keynote speaker, presented at leading industry conferences, written seven textbooks, published numerous articles, is a reviewer for multiple journals, and is a frequent PMI, INCOSE, ALN, and SPIN speaker. He is a Certified PMP, CSEP, ACP, CSM, and SAFe Agilist, and teaches at five Washington, DC-area universities. He holds a B.S. in Computer Science, M.S. in Software Engineering, and D.M. in Information Systems. He has been in the IT field since 1983.

Get Your Agile Testing Up To SPEED! with Clark Cochran

The webinar provides insights into how to:

  • Perform continuous testing & re-testing with push-of-a-button speed
  • Ensure comprehensive coverage during each sprint & with every requirement change
  • Gain complete visibility into what has and has not been tested
  • Reuse your manual test cases, models, and existing requirements for greater efficiency
  • Auto-generate OPTIMIZED regression and progression test suites
  • Comprehensively integrate with your existing testing tools

About the speaker

Clark Cochran
Clark leverages over 30 years of highly successful experience in advanced technology product sales, service sales, and management, in both direct and partner sales. He has held sales management positions at several early-stage start-up companies, including VirtualLogix, Mirabilis Design, Coventor, and CoWare. He enjoys bringing customers innovative products that can dramatically improve their own product design and delivery. Clark has also held sales positions at Cadence Design Systems, Oracle, and Hewlett Packard. Prior to sales, he was a design engineer and is a Registered Professional Engineer. Clark graduated with a BSCE degree from the University of Washington.

Make a Difference and Use TMMi with Clive Bates

View this webinar to learn about:

  • Ways organizations adopt TMMi as the framework to help them make a difference
  • Examples of organizations where TMMi has helped
  • What those organizations did to become better
  • Ideas that will help improve effectiveness and efficiency

About the speaker

Clive Bates
Clive Bates has been in the testing industry for many years. He has a great deal of practical experience in all aspects of testing and a real passion for the industry. Clive is an Accredited TMMi Lead Assessor and has completed a number of TMMi assignments, as well as teaching and presenting as an accredited trainer for the TMMi Professional certification. He was one of the founding members of the original ISEB Foundation syllabus and has been involved with the ISEB, and now the ISTQB, testing certifications ever since.

Data Maturity Model Part 1: Why do you need data maturity? with Jeff Gorball

Attendees will learn what data management maturity means, why it is necessary for their organization, and how they can generally determine where they may be along the continuum of data management maturity.

About the speaker

Jeff Gorball
Jeff Gorball is a Managing Director at Kingland Systems, where he is primarily responsible for the consulting practice around data management maturity supporting Kingland’s clients.
Mr. Gorball introduced the concept of a data management maturity model to the strategic partnership that ultimately developed the DMM Model. He has been on the DMM working group since its inception and became the first non-CMMI Institute member to be accredited by the CMMI Institute to consult on and conduct workshops on the Model.
Jeff is also certified on the Enterprise Data Management Council’s Data Management Capability Assessment Model (DCAM), making him the first person in the world to hold dual certification on the DMM and DCAM models. The Enterprise Data Management Council has asked Mr. Gorball to assist in providing training on the DCAM model.
Jeff has more than 30 years of experience providing mission-critical technical services to high-availability data and communication centers, systems, and facilities. Prior to joining Kingland Systems, he retired from the Marine Corps after 24 years of active service in the communications field.

Data Maturity Model Part 2: Introduction to Data Management Maturity Models with Jeff Gorball

Attendees will learn what the DCAM (Data Management Capability Assessment Model) and DMM (Data Management Maturity) models are and how they can be used.

Key messages/takeaways:

  • Understanding the need and nature of maturity models
  • Review of the DCAM model and its content
  • Review of the DMM model and its content
  • Introduction to scoping your use of the models
  • Examples of how organizations have used the model, with discussion of different types of organizations, maturity levels, and organizational objectives

About the speaker

Jeff Gorball
Jeff Gorball is a Managing Director at Kingland Systems, where he is primarily responsible for the consulting practice around data management maturity supporting Kingland’s clients.
Mr. Gorball introduced the concept of a data management maturity model to the strategic partnership that ultimately developed the DMM Model. He has been on the DMM working group since its inception and became the first non-CMMI Institute member to be accredited by the CMMI Institute to consult on and conduct workshops on the Model.
Jeff is also certified on the Enterprise Data Management Council’s Data Management Capability Assessment Model (DCAM), making him the first person in the world to hold dual certification on the DMM and DCAM models. The Enterprise Data Management Council has asked Mr. Gorball to assist in providing training on the DCAM model.
Jeff has more than 30 years of experience providing mission-critical technical services to high-availability data and communication centers, systems, and facilities. Prior to joining Kingland Systems, he retired from the Marine Corps after 24 years of active service in the communications field.

Data Maturity Model Part 3: How do you realize the benefits? with Jeff Gorball

Through example case studies, attendees will learn how the Data Management Maturity Model (DMM) has been used and what organizations have achieved, or can achieve, through it.

About the speaker

Jeff Gorball
Jeff Gorball is a Managing Director at Kingland Systems, where he is primarily responsible for the consulting practice around data management maturity supporting Kingland’s clients.
Mr. Gorball introduced the concept of a data management maturity model to the strategic partnership that ultimately developed the DMM Model. He has been on the DMM working group since its inception and became the first non-CMMI Institute member to be accredited by the CMMI Institute to consult on and conduct workshops on the Model.
Jeff is also certified on the Enterprise Data Management Council’s Data Management Capability Assessment Model (DCAM), making him the first person in the world to hold dual certification on the DMM and DCAM models. The Enterprise Data Management Council has asked Mr. Gorball to assist in providing training on the DCAM model.
Jeff has more than 30 years of experience providing mission-critical technical services to high-availability data and communication centers, systems, and facilities. Prior to joining Kingland Systems, he retired from the Marine Corps after 24 years of active service in the communications field.