• Welcome
  • Philosophy
  • Services
  • Bio and Résumé
  • Contact

Bio and Employment Information for Jason LeCount

Hi. My name is Jason LeCount. I've been doing test automation for years now, on Unix/Mac, Android, Linux and Windows.
I specialize in improving the testability of applications, as well as writing automated test frameworks and test suites for web applications. I'm a big fan of agile methodologies and functional languages.

I love testing -- especially using tools and automation to test smarter, not harder. Take a look at the links on the left and let me know if I might be of service to you.



The Team Must Prove It!

How Do You Really Know It Works?

Applications are complex. A "best guess" on whether code works is almost always insufficient. Any team member who touches the application -- from Product Management to Development to Operations to QA -- must be able to answer the question: "How is this provably correct?"

A product manager ensures that the feature makes sense in the context of the whole application, and ideally works with Development and QA to consider technical design factors, testability and feature limitations.

A developer should be able to demonstrate that code works via unit tests, and those tests should run on a Continuous Integration server.
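As a minimal, hypothetical sketch (the function and names are invented for illustration), a developer-level proof can be as small as a unit test that pins the behavior down:

```python
import unittest

def normalize_email(raw: str) -> str:
    """Canonicalize a user-supplied email address before storing it."""
    return raw.strip().lower()

class NormalizeEmailTest(unittest.TestCase):
    def test_strips_whitespace_and_lowercases(self):
        self.assertEqual(normalize_email("  Bob@Example.COM "), "bob@example.com")

    def test_canonical_input_is_unchanged(self):
        self.assertEqual(normalize_email("bob@example.com"), "bob@example.com")

if __name__ == "__main__":
    # exit=False so the run reports results without terminating the process
    unittest.main(exit=False)
```

On a CI server, a run like this happens on every commit, so "does it work?" has a standing, checkable answer rather than a best guess.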

A QA engineer should have repeatable tests, running regularly in a regression suite, that prove that end-to-end user functionality works. There must be a testing strategy for negative tests. Happy path testing does not suffice for happy customers.

Operations should be able to tell when a feature is not working, and understand what actions to take should that occur.

Testing Should Mirror the Layers of an Onion

UI Tests Are Not Enough!

Selenium is great. I write a lot of Selenium tests, but they represent merely one gauge of application correctness.

Given that a web application is composed of many layers -- from the deployment process, to a load balancer, to the web container, to the database, to the services layer, to JavaScript, to the HTML presentation -- testing must also reflect these layers.

There are three primary reasons we should be thinking of testing in layers.

First, it's easier. In fact, in many cases, testing all your scenarios will be impossible if your browser is your only tool. If you tout fault tolerance as a major application feature, yet cannot simulate faults from a browser, your testing strategy is failing.

Second, it's often faster. While UI tests give you good bang-for-the-buck by hitting your entire stack vertically, they are slow. Launching a browser, waiting for network I/O, and so on introduce serious latency in a test suite once you have a significant number of tests.

If you can prove your assertion without launching the browser, do it!

Third, it's sometimes just the right level of abstraction. While a developer is adding new methods to a class, it's quick and easy to add unit tests. Depending on UI tests to catch bugs in your sort algorithm is extremely inefficient, and your QA group will never scale to adopt this level of testing atop what's already on their plate.
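To make that third point concrete with an invented example: an ordering rule like the one below is proven in microseconds at the unit level, where checking the same logic through a browser would cost seconds per case.

```python
def sort_search_results(results):
    """Order (title, relevance) pairs by descending relevance,
    breaking ties alphabetically. A hypothetical ordering rule."""
    return sorted(results, key=lambda r: (-r[1], r[0]))

# Exercising the tie-break directly -- no browser, no network I/O.
results = [("bananas", 2), ("cherries", 5), ("apples", 5)]
assert sort_search_results(results) == [
    ("apples", 5), ("cherries", 5), ("bananas", 2),
]
```

A UI test would still be worth having for the end-to-end flow, but it's the wrong place to enumerate every ordering edge case.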

Automation Doesn't Replace Eyeballs

Human Users Are Still Vital During Testing

So let's now say we've got our QA group, developers, operations team and PM all thinking about product quality. QA has written service tests that check error conditions and correct behavior, and has UI automation validating the end-to-end behavior in a browser. Development has become totally test-infected and is writing JUnit / TestNG / RSpec tests like it's going out of style. Are we done? Have we proven that our application is correct?

Probably not.

Unless your application lacks a visual component, most of these tests can't tell you much about user experience. For instance, perhaps:

  •   DOM elements render badly
  •   A developer left in debug code
  •   The workflow you've coded is horrible to use

Manual testing can find these issues trivially, while finding them with automation is extremely hard.

Automate the tests that humans find difficult, tedious, or time-consuming.

Manually test that which is very easy for a human, infrequently needed, or time-consuming or difficult to automate.


I offer a variety of services as an independent contractor. I can devise an approach to test a tricky bit of functionality, help fix problems with your test automation, get you started if you're just beginning your automation effort, or work with a QA team to build the skills needed to succeed with automated testing.

Bootstrapping Your Testing

Want to start but don't have anything yet?

Automated testing requires a significant setup effort -- from servers (in-house or in the cloud), to software installation, to framework creation, to Continuous Integration setup...there's a lot to do before you write your first test!

Let me help get you testing by handling server setup (both hardware and software), framework setup, and automated test result reporting.

Your time to begin writing good, reliable tests is greatly reduced, you gain great testcase visibility, and both your testers and your clients sleep better at night.

Writing Custom Test Frameworks

Make automation easy -- start with a good framework!

A solid framework (whether a UI framework or not) provides structure and services to make your tests reliable, maintainable and easy to write.

A common mistake is to start test automation without a framework, record a bunch of tests, then eventually abandon the effort once they become a maintenance nightmare.

A well-designed framework provides a solid structure that encourages code reuse and allows your tests to be succinct and easy to maintain.

A good framework may also, depending on the type of testing, help manage and feed data to tests, test on a variety of browsers seamlessly, execute tests in parallel, and ensure that test results are easily available.
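As one illustration of the kind of structure I mean, here is a hypothetical page-object sketch (all names are invented, and a fake driver stands in for Selenium so the example is self-contained): the framework centralizes locators and mechanics so the tests themselves read as intent.

```python
class FakeElement:
    """Stands in for a WebDriver element; records what tests do to it."""
    def __init__(self, log, locator):
        self.log, self.locator = log, locator
    def send_keys(self, text):
        self.log.append(("type", self.locator, text))
    def click(self):
        self.log.append(("click", self.locator))

class FakeDriver:
    """Stands in for a real WebDriver so this sketch runs anywhere."""
    def __init__(self):
        self.log = []
    def find_element(self, by, value):
        return FakeElement(self.log, (by, value))

class LoginPage:
    """A page object: locators and mechanics live in one reusable place."""
    USERNAME = ("id", "username")
    PASSWORD = ("id", "password")
    SUBMIT = ("css", "button[type=submit]")

    def __init__(self, driver):
        self.driver = driver

    def log_in_as(self, user, password):
        self.driver.find_element(*self.USERNAME).send_keys(user)
        self.driver.find_element(*self.PASSWORD).send_keys(password)
        self.driver.find_element(*self.SUBMIT).click()

# The test stays succinct: when the login form changes, only LoginPage
# changes -- not every test that logs in.
driver = FakeDriver()
LoginPage(driver).log_in_as("alice", "s3cret")
assert driver.log[-1] == ("click", ("css", "button[type=submit]"))
```

Without this layer of reuse, every recorded test carries its own copy of the locators, which is exactly the maintenance nightmare described above.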

Based on your needs, I can advise upon and implement a testing framework for you that will allow your test automation to scale, provide the visibility you need, and show off what automated testing can do!

Writing Custom Test Suites

Got code you don't know how to test?

Sometimes testing is hard. Using the wrong tools for the job, or an unsuitable test approach, guarantees it. And even with the right tools and approach, sometimes it's still just hard.

Let me figure out how to test your "untestable" feature. I'll work with Development and QA to figure out how to get the test results you need now, and establish a regression suite to make sure working features stay that way.

Training Your Team

Success requires the right skills

Test automation is coding. Good automated tests are written with good code. A good test badly coded can be a weak link in your automation suite.

Let me help bring your team up to speed with the latest techniques and best practices in test automation. I can offer either training sessions on a variety of topics, or one-on-one mentoring / pair-programming.

Some of the topics I can work with your team on:

  • Best Practices When Writing Automated Tests
  • Learning Selenium
  • UI Test Framework Design and Best Practices
  • Advanced Test Automation Techniques
  • Writing Tests in Ruby -- How and Why
  • Java Skills
  • Unix Skills

My Résumé

I grew up in the San Francisco Bay Area, and began software testing at Pyramid Technology in 1989, back when minicomputers still existed. I realized that software testing often involved repetition, huge matrices of data, and an unwavering attention to detail. Since computers are intrinsically better suited to dealing with all of the above factors, I began looking at ways to automate my testing.

In 1993, I decided to move to New York. Along the way, I almost lost a cat in Nebraska and did break up with my girlfriend around New Jersey. I settled in Westchester County, NY, then spent the next 8 years testing financial applications for major Manhattan investment firms -- primarily on Sun boxes running Solaris and X/Windows. The apps ranged from equities to fixed-income to currency trading systems.

In 2000, I returned to California and met my wife-to-be. We were married in 2003.

Much of my experience since 1999 has been testing J2EE applications. Since 2000, I've worked at a variety of startups, one of which was acquired by Cisco, where I stayed for four years. As of 2017, my languages of choice are Python, Ruby and Scala. I'm currently learning Clojure and am a definite fan of functional languages.

Sungevity -- Test Architect

June 2014 / March 2017

  • Led a team of 5 focused on automated testing and test infrastructure at Sungevity.
  • Mentored and ran training sessions for the entire QA team (18 people) on WebDriver, automated testing, REST API testing, Docker, Git, Python and Scala.
  • Wrote a WebDriver-based UI automation framework in Python. Features included Salesforce integration, page objects, transparent support for single-page JavaScript apps, cross-browser support, and mobile support. Test result persistence was automated, using TestRail as the system of record.
  • Wrote a requests-based API testing framework in Python.
  • Wrote a Scala library of functional test utilities for the Sungevity back-end. Features included functional test support with the same semantics as the Play controller unit/integration tests, Siren response parsing, TestRail test result integration, and JSON request construction.
  • Wrote a HipChat integration to push post-release validation test results into HipChat rooms.
  • Wrote and executed load tests using the Locust Python library.
  • Wrote a Python utility for automatic generation of test users used by Dev and QA for internal testing.
  • Implemented a Docker-based Jenkins stack in which master and slaves are Docker containers. All changes to the Jenkins configuration (master and jobs) are stored in a GitHub repo, encrypted where the data is sensitive.
  • Implemented Docker-based tooling to provide a consistent local (OSX / Windows) test environment regardless of host OS. Implemented with a mixture of Bash and Python.
  • Maintained the Jenkins server, including job creation and updates.
  • As part of a hack day, wrote a JavaScript app, which became an internal tool that simplified test triage by providing a dashboard view of failing tests.
  • Wrote a Python tree-based test framework for test cases structured around a model of tree traversal. Nodes represented application states and edges represented actions to transition between states. Test setup consisted of declaring rules about how the tree could be traversed; the framework then walked all valid paths through the application under test given those constraints.
  • Organized and ran a 6-week Haskell lunchtime study group.
  • Off hours: successfully completed the following Coursera classes in Scala: Functional Program Design in Scala, Parallel Programming, and Principles of Reactive Programming.
  • Currently learning Clojure.
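The tree-traversal framework described above can be sketched in miniature (all names here are hypothetical; in the real framework each action drove the application under test rather than a toy dictionary):

```python
def all_paths(edges, start, goals):
    """Enumerate every action sequence from `start` to a goal state.

    `edges` maps a state to a list of (action, next_state) pairs;
    the model is assumed acyclic, per the declared traversal rules.
    """
    paths = []
    def walk(state, actions):
        if state in goals:
            paths.append(actions)
        for action, nxt in edges.get(state, []):
            walk(nxt, actions + [action])
    walk(start, [])
    return paths

# A toy application model: log in, then either open settings or log out.
edges = {
    "login": [("log_in", "dashboard")],
    "dashboard": [("open_settings", "settings"), ("log_out", "logged_out")],
}
assert all_paths(edges, "login", {"settings", "logged_out"}) == [
    ["log_in", "open_settings"],
    ["log_in", "log_out"],
]
```

Declaring states and transitions once, then generating every valid path, trades hand-written scenario lists for systematic coverage of the model.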

Okta - Senior Software Engineer in Test, Internal Build and Tools Engineer

June 2013 / June 2014

  • Wrote a Node.js / Firebase-backed server to provide a real-time view of Okta's Jenkins CI pipeline.
  • Wrote Node.js scripts to collect historical metrics on Jenkins build jobs.
  • Wrote a Jenkins plugin in Java to do setup and teardown on Jenkins slaves.
  • As part of a team of three, enhanced and maintained a large (120 slaves, 2000 builds/day) Jenkins EC2-based build/test infrastructure.
  • Ported a large (4000-test) test suite from JUnit to TestNG.
  • Converted a large set of Jenkins jobs with inline job definitions to a bash-based job framework.
  • Wrote Ruby scripts to help automate triage of failing tests.
  • Maintained a Python Gearman-based Jenkins job queue.
  • Owned and extended a Ruby-based tool to set up engineers' laptops upon joining Okta and to automate periodic laptop updates of various tools (e.g. new Git versions, new internal dependencies such as memcached, etc.)

Netpulse - QA Architect

October 2011 / Present

Responsible for improving testability of all Netpulse application components, as well as growing an Agile and test-driven Engineering culture. My primary goal is to ensure that all application components are easily testable, that staff (Dev, QA) understand how to write unit and functional tests, and that impediments to doing so are removed.

Android Team:

  • Wrote and maintained all build / CI / release automation with Ant and Jenkins.
  • Integrated Guice (IoC framework) and Robolectric with our application.
  • Mentored Dev team in test-driven development including topics such as testable design, dependency injection, unit vs. integration testing, etc.
  • Trained team in transitioning to git from svn.
  • Wrote POJO and Robolectric tests.
  • Wrote functional test framework with monkeyrunner and added longevity tests and graphed 'top offenders' -- an aggregate list of exceptions caught from the prior monkeyrunner run.

Qt (legacy client) Team:

  • Added functional UI automation tests via FrogLogic's Squish tool

Legacy Server Team:

  • Wrote a Ruby-based tool for HTTP API testing.
  • Mined production logs to generate a regression suite of 2500 distinct inputs and correct XML responses for existing legacy server functionality for the purpose of testing backwards compatibility of new server.

New Server Team:

  • Added functional tests for REST API via Ruby framework.
  • Began work in Scala to automate the build with SBT and add tests with specs2. The Scala server port was cancelled due to a management change, so, sadly, this project never took off.
  • Wrote selenium-webdriver-based UI automation of web portal.
  • Performed load testing with Tsung. Graphed and published run results as well as historical trends.

Cisco Systems - Senior Automation QA Lead

February 2007 / July 2011

Led a team of 3 automation engineers within an Agile (Scrum) team of developers, product managers, QA engineers and Ops engineers. Responsible for automation tool choice, implementation and design of test frameworks, working with junior engineers to build out suites, and general oversight of junior team members. Wrote a Selenium-based UI test framework to automate UI regression of the Cisco Eos social network. The test framework had the following features:

  • Distributed and parallel test execution across multiple selenium servers.
  • Fast, multi-browser execution using CSS selectors.
  • Test result bridge to HP QualityCenter, which allows real-time integration with black-box test inventory and results.
  • Allows black-box team to execute automated suites on their own machines, then push results back to QualityCenter. This has saved tremendous manual test time.
  • TestNG-based test API allows for flexible and powerful tests.
  • Test results stored in a MySQL database to enable analysis of historical test run data.
  • With one other automation team member, wrote 1500 testcases for the Eos social network.

Wrote an ETL (extract, transform and load) regression test framework and suite for testing historical data. Features included:

  • Execution of real user activity on QA, Stage and Production on an hourly basis.
  • All cookies and contextual information from test user activity written to an audit database for subsequent validation.
  • Validation suite compares stored test metadata with ETL historical information via a SQL test suite. The data comparison framework allows very simple data-driven testcases to be written. The current suite validates 3000 data points.

Led the initiative to transition the CMSG group off of Subversion and onto Git. Gave presentations, trained engineers, and was an overall advocate for its adoption.

Fiveacross.com - Senior QA and Build Engineer

October 2006 / February 2007

Developed automated deployment scripts for QA, Stage and Production using Capistrano. Developed rspec tests for a Ruby on Rails social network that hosted sites for NASCAR, the NHL and Televisa. Tested and deployed infrastructure updates for all environments.

Northstar Systems International - Senior Tools and Build Engineer

February 2003 / October 2006

Implemented scripts to do client deployment (appserver configuration, deployment, database creation and population).

Implemented automated build / deploy / test infrastructure for internal Dev / QA use. Infrastructure is responsible for cross-platform (Linux and Win XP) continuous and daily builds, unit test execution and website update, javadoc updates, emailing build failures and unit test results and automated creation of TeamTrack defects from unit test failures.

Trained Indian QA team in New Delhi on the Northstar API and unit-testing best practices.

Co-wrote and maintained the NorthStar Ant build. The build integrates with Eclipse and determines build dependencies via Eclipse project dependencies.

Code-generated JUnit testcases using Velocity to test CRUD methods on data objects. Implemented a variety of other testcases using JUnit / Cactus to test business logic.

Migrated the existing JUnit test framework to a Hibernate/Spring-based framework, allowing testing of application services without requiring an appserver.

Implemented Development and Customer Support bug tracking workflows using Serena TeamTrack. Set up end-to-end security through the firewall via an Apache proxy that performs an additional layer of authentication for external users.

Performed various release control duties (branch management, label creation, integrations / reverse integrations) using Perforce.

Configured and managed ASP monitoring scripts with nagios.

E2Open.com - Senior Software QA Engineer

June 2002 / November 2002

Responsible for developing automation infrastructure for the QA group. Designed and implemented an XML content generation program in Java for generating data conformant to a specific DTD. Wrote an extensible data mapping validation tool in Java for use by other QA engineers. Implemented a JUnit-based testing framework for QA whitebox testing. Performed content validation on an XML transformation engine. Wrote a distributed messaging component using log4j to provide communication between a server application and a JUnit testcase. Wrote a variety of bash/Python utilities to automate system-level testing.

Kenamea Inc - Senior Software QA Engineer

July 2001 / February 2002

Wrote automated testcases in Java using Macaca (a JUnit-like testing framework) to unit test a JMS messaging implementation. Wrote system-level tests to realistically test multiple users sending to/from JMS queues and topics.

Wrote automated testcases for a transaction processing system for Kenamea's message switch product. Testing verified that transactions were durable, isolated, could be committed and rolled-back in a variety of situations. Designed a multi-machine environment to perform system crash testing to verify that uncommitted transactions could be rolled back with no lost messages / corrupted state.

Designed and coded modules to support testcase / test results entry into an Oracle database with a web-based front-end. Scope of work included designing database schema, coding stored procedures, coding Java classes/JDBC access layer and JSPs to display the testcase / results data.

Automated the existing performance testing harness to run on its own, diagnose common types of failures (hung server, etc.), and add testcase results to the database at the end of the test run. The finished system automatically installed server-side and client-side components when new builds were available, ran performance tests, and inserted results into the database (see above). Thus, up-to-date performance results were always available company-wide via a web-based interface.

Instinet - Independent QA Contractor

January 1999 / August 2000

Designed and coded a load-testing application in Java to create and manage test clients for an EJB-based Fixed Income global trading application. Load testing application was distributed world-wide (New York, London, Frankfurt and Paris) and used to simulate up to 2000 simultaneous users. Analyzed results from test output to identify performance bottlenecks.

Designed and coded a Java test suite for the purpose of regression testing the Fixed Income trading system back-end.

Designed and implemented a GUI regression test suite in Silk. Test suite automated the testing of all major GUI features and was used to automate Smoke testing of new builds of the Fixed Income system.

Reuters - Independent QA Contractor

October 1998 / January 1999

Designed and implemented a regression test suite for a Java-based bond trading application. Wrote extensions to the application in Java to enhance testability of application. Trained personnel in automated testing methodologies and 4Test. Wrote extensive documentation (~100 pages) on the process of automated testing, as well as best practices for implementation using Silk.

Merrill Lynch - Independent QA Contractor

October 1997 / July 1998

Designed and implemented a regression test suite for the purpose of testing a Solaris-based Foreign Exchange trading system. Features include testing of ticket entry, correct blotter behavior (activity, credit exposure, positions, etc.), start of day, and end of day. Coded in C++ tests for the validation of real-time messages between applications across a network. Wrote a Tcl/Tk script for the Systems group to launch different versions of the FX trading application from a GUI interface.

Goldman Sachs - Independent QA Contractor

November 1996 / September 1997

Designed and implemented a generic object library in 4Test for testing applications written with a proprietary API. Testing objects are configured via ini files and are instantiated dynamically for greater flexibility. The object library is extremely flexible, and provides a novice QA Partner user with an interface that encapsulates application logic within objects.

Designed, coded and implemented a test suite with QA Partner / QA Organizer to validate a client/server equities order entry system. Order system is distributed among four client machines (both Solaris and Windows clients), including Sales, Trader, Floor, and SIAC, with a Sybase database server back-end. Test suite simulates order entry, execution, and messaging between all front-end machines, and performs GUI validation as well as database validation.

Coded a C++ real-time price provider for a SIAC simulator, for the purpose of accurately simulating automated order handling. Price classes were implemented with Rogue Wave Tools.h++.

Wall Street Systems - Independent QA Contractor

November 1996 / December 1996

Managed a project to validate GUI functionality of a Futures trading system comprising over 600 screens. Provided an easily configurable test suite that can be extended and supports languages other than English.

Dow Jones Telerate - Independent QA Contractor

July 1996 / October 1996

Designed and coded utilities to test market feed distribution software. Utilities were written in C++, Perl, and Korn shell. Rewrote Dow Jones' existing tests to increase efficiency. Testing utilities included software to translate data formats, and compare log data from test runs.

AIG Trading Corporation - QA Contractor for Coopers & Lybrand

September 1995 / July 1996

Designed and coded a C++ test suite to unit-test API functions for a Foreign Exchange trading application. API functions interfaced with an Oracle database to manipulate trade and currency position data. Aspects of validation included functional, stress and efficiency.

Designed, coded, and maintained a test suite using QA Partner to perform unit, integration, and regression testing of an entire Foreign Exchange trading system. All phases of foreign exchange trading were modeled, from front-office trading to back-office confirmation and instruction. Aspects of validation included functional, stress, efficiency, and integration.

Trained personnel in the 4Test language and use of its IDE, QA Partner.

Smith Barney - QA Contractor for Coopers & Lybrand

July 1994 / July 1995

Designed and implemented a test environment for a 1,600 turret trading floor as part of a team: wrote test plans, advised client staff on hardware and software needs, wrote test specification documents, designed, developed and analyzed test suites written in XRunner (TSL), C, Perl, Korn and C shell. Gave seminars on developing automation tests and the use of automation test suites to improve software quality. Automation tests written included NIS+ and NFS stress tests, IBM AS/400 stress tests, application functionality, application interoperability, Market Data, and Foreign Exchange trade simulation. The automation test environment created was able to realistically simulate and control all aspects of trading on a 1600-turret trading floor from one workstation.

In C, added a Motif front-end to a TCP/IP based workstation control application. Application was used to control all aspects of automation testing in extremely large networks.

Chemical Bank - QA Contractor for Coopers & Lybrand

July 1993 / June 1994

Designed and coded a test suite to perform automated testing on a 400 turret trading floor. Wrote tests to verify network functionality and bandwidth, application functionality and resiliency to system stress, and system integration.

Maintained a Perl application responsible for remote test execution on 150 machines. As a result of enhancements to the application's communication with its clients, overall performance increased 200%.

Pyramid Technology - Senior Software QA Engineer

July 1990 / July 1993

Designed and coded test software for two Ethernet device drivers, disk and tape device drivers, and compilers, including C, C++, COBOL, FORTRAN 77, and Pascal. Test development included writing software test plans, writing technical specifications, developing test suites, executing test suites, and maintaining test software.

Developed an Ethernet test suite to cover the user, streams, and protocol layer (TCP and UDP) levels. Executed test suite against a single-interface and a dual-interface Ethernet device driver.

Wrote and maintained a test automation harness in C++ to control execution and error reporting of tests distributed throughout a network.

Designed and implemented a utility to automatically back up and restore critical data to and from a network.

Represented Pyramid Technology as liaison to Olivetti Corp., ensuring that their UNIX port was compliant with various conformance test standards. Trained personnel there in use of the X/Open Conformance Test Suite. Responsible for running and analyzing conformance test suites, including POSIX, X/Open, System V Verification Suite (SVVS), and gABI (Generic Application Binary Interface) tests. Approved all operating system releases based on results of conformance testing.

Stanford Linear Accelerator Center - Programmer / Analyst

October 1988 / August 1989

Designed and implemented an electronics schematic wirelist translation program. Modified and maintained a voicemail/telephone paging system to route phone communication building-wide.

California State University Chico

June 1989/ June 1990

Coursework included Pascal, Ada, C, and data structure theory.

De Anza Community College

June 1991 / January 1993

Coursework included C++, SVR4 Kernel programming, and TCP/IP network programming.

My focus areas are test automation and build and release tools. Any skillset that is no longer current is not mentioned here.

If I didn't write the tool, I don't think I have 5 star knowledge of it.


4 years experience writing frameworks with Selenium as well as writing tests using it.


years of almost exclusive work coding in Java and working on J2EE applications.


My preferred language, though I have not used it as much professionally in a few years.


Been using Unix since 1989 and am a bash die-hard.


A version control system to love. Used for 3 years with git-svn.


4 years of work in Groovy, implementing our Selenium framework with it.


My test framework of choice in the J2EE world. My selenium framework used TestNG as its execution engine.


Still my go-to tool for build scripts.


Solid MySQL skills.

Selenium 2

Learning Selenium 2 now in order to transition future frameworks over to it.

Most of my work for investment banks was on a contract basis. Since 2001, my experience has been as a full-time employee.

List of Clients and Employers

  • Sungevity
  • Okta
  • Netpulse
  • Cisco Systems
  • fiveacross.com
  • Northstar Systems
  • E2Open.com
  • Kenamea, Inc.
  • Instinet Inc.
  • Reuters
  • Merrill Lynch
  • Goldman Sachs
  • Wall Street Systems
  • Dow Jones Telerate
  • AIG Trading Corporation
  • Smith Barney
  • Chemical Bank
  • Pyramid Technology
  • Stanford Linear Accelerator Center

Steve Atherton

VP of Engineering, Sungevity

He is, without doubt, one of the best QA automation engineers I've worked with over the years.

He has a tremendously broad technical base from which he draws, and a great ability to think through very complex problems and design (and implement) very elegant solutions. He works well within his team, providing clear technical direction when needed, and is an awesome team player - mentoring, helping and guiding when required.

He's super dedicated to his craft, always goes the extra mile for the team and always delivers. He is widely respected by all of the QA team, and in fact, all of Software Engineering looks to him for help, advice and guidance all the time.

He's a true visionary, a great technical lead and an amazing individual contributor. Any organization would be lucky to have Jason on their team.

Andy Hull

Platform Architect, Sungevity

Ever-willing to explore new tools and technologies to solve our quality challenges, Jason has been a critical counterpart in building our microservices architecture from our first exploratory deployments to a fully-fledged production AWS environment.

During my time working with Jason, I greatly enjoyed his partnership and expertise, particularly in his continuous search for quality improvements and expanded test automation coverage.

Jason is also a talented programmer who seamlessly swaps between well-crafted Python and a masterful command of functional and type-safe programming in Scala. He personally developed high-quality test support libraries and effectively mentored QA Automation Engineers in Python and Scala.

His leadership was a vital component in hammering-out a rocky, mistake-prone release process into a highly-predictable operation through comprehensive test automation and transparent status reporting.

His focus on results helped to improve software team morale by minimizing the need for release-night hot fixes and significantly reducing the overall release duration, much to the relief of frazzled software team members.
His willingness to identify, own and, most importantly, resolve issues in the build/test/release pipeline was a great asset to the organization.

I recommend Jason to any team tackling the challenges of delivering complex software products through a modern CI/CD pipeline to the highest degree of quality and predictability. I would relish any opportunity to work with him again.

Dave Arguelles

Test Engineering Manager, Cisco Systems

Jason's passion for architecting test engineering systems is infectious. As the automation lead for my team at CMSG, he drove all aspects of the automation platform from the ground up. When it came time to expand the platform to include multiple toolsets, Jason dove in, helping to create a highly stable bridge between proprietary and open source frameworks which helped us meet our overall QA goals of expanding automation inventory. His ability to bootstrap a framework in a short period of time with little direction proved to be incredibly valuable in rapidly including multiple test assets into a common inventory and reporting system.

Chi Hoang

Data Architect, Cisco Systems

Jason is one of the most skilled software engineers I've worked with. He is extremely passionate about testing automation, and he leverages his excellent software engineering skills to develop automation frameworks for various parts of the tech stack from front-end UI to back-end services. I would love to partner with Jason again.

Dan Scheinman

Senior VP and General Manager, Cisco Systems

Jason is a great human being and has had a variety of roles for CMSG. Jason really thrived in his position as automation lead. He was the chief advocate of automation inside CMSG QA. We went from low automated test coverage to 95% and a lot of credit goes to Jason. Jason is very smart and loves to keep learning about things and pushing the envelope.

Bala Tellakula

QA Automation Engineer, Cisco Systems

I have worked with Jason for almost 2 years now, and all through, he has shown tremendous ability and technical understanding in his work. He is always on top of trying new things and approaches to make automation testing more efficient, reliable and accessible. I highly recommend him for any job and would love to work with him again in the future.

Mike Bosch

Principal Software Engineer, Northstar Systems

Jason is one of the most dedicated co-workers I've ever worked with. He's extremely conscientious about doing the right thing and will do what it takes to get it done. Jason is always looking for better ways to do things and stays on top of new trends to see how they can help solve problems.

Contact information available upon request.

Contact Information

Feel free to contact me below if I may be of service to you. I am based in the San Francisco Bay Area. For short contracts, I may be willing to travel if you're not local.


