DreamOps Walk Through
Let's take a brisk stroll through a DreamOps software delivery lifecycle.
- Making Changes - First, we put our right foot forward by specifying the changes to be made in a well-designed development environment.
- Validating Changes - The changes are then checked by another team member, to be sure that the change is done right, and that it's doing the right thing.
- Distributing Changes - Once verified and validated, changes can be distributed to customer environments.
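The three phases above can be pictured as an ordered pipeline. This is only an illustrative sketch of the flow, not any real tool's API; the stage names and the `change` record shape are hypothetical.

```python
# Hypothetical sketch of the DreamOps lifecycle as an ordered pipeline.
# Stage names and the change-record shape are illustrative only.

def make_change(change):
    # Specify the change in a well-designed development environment.
    change["status"] = "specified"
    return change

def validate_change(change):
    # A second team member checks the change is done right
    # and is doing the right thing.
    change["status"] = "validated"
    return change

def distribute_change(change):
    # Verified and validated changes go out to customer environments.
    change["status"] = "distributed"
    return change

PIPELINE = [make_change, validate_change, distribute_change]

def run_pipeline(change):
    for stage in PIPELINE:
        change = stage(change)
    return change

change = run_pipeline({"id": "CHG-1"})
```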
New changes are specified with multiple subscribers in mind, including a mock customer, and include sample data when appropriate. Admins and Coders can provision a development trial instance by asking Hubot in a HipChat window, or by completing a form on a secure Visualforce page.
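Real Hubot scripts are written in CoffeeScript or JavaScript, but the shape of the chat command handling can be sketched as follows. The command syntax and the `provision_trial` helper are hypothetical, not the actual bot script.

```python
import re

# Hypothetical sketch of the chat command a Hubot-style bot might handle
# to provision a development trial instance. The command syntax and the
# provision_trial helper are illustrative, not the real Hubot script.

TRIAL_COMMAND = re.compile(r"^hubot\s+trial\s+for\s+(?P<user>\S+)$")

def provision_trial(user):
    # In the real system this would kick off the proxy-signup service;
    # here we just return a fake ready-to-use org descriptor.
    return {"user": user, "org": f"trial-{user}", "status": "ready"}

def handle_message(text):
    match = TRIAL_COMMAND.match(text.strip().lower())
    if not match:
        return None  # not a trial request; ignore
    return provision_trial(match.group("user"))

result = handle_message("Hubot trial for alice")
```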
Development environments are pre-processed (via proxy signup) and arrive ready-to-use. Developers just sign-in-and-go using a preset password. IDEs can be attached directly to the org without cloning the Git repository locally. Any changes made to the development environment can be added to an unmanaged package in the org, and automatically committed back to version control without special knowledge of Git or Bitbucket, including changes to the mock sample data.
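The "commit back without special Git knowledge" flow can be sketched as turning the org's changed components into a commit plan. The component-record shape, file layout, and `plan_commit` helper are hypothetical, assumed for illustration.

```python
# Hypothetical sketch: changed components pulled from the dev org's
# unmanaged package become a commit plan, so the developer never touches
# Git or Bitbucket directly. All names here are illustrative.

def plan_commit(changed_components, author):
    files = [f"src/{c['type']}/{c['name']}" for c in changed_components]
    message = f"Sync {len(files)} change(s) from dev org for {author}"
    return {"files": sorted(files), "message": message}

plan = plan_commit(
    [{"type": "classes", "name": "InvoiceService.cls"},
     {"type": "objects", "name": "Invoice__c.object"}],
    author="alice",
)
```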
The package's developer API is documented with ApexDocs for Global classes and Octopus for other shared components. Both Help for this Page and the Confluence Help site are updated each sprint to cover tasks, topics, and references meaningful to end-users. New regression tests for Global methods are created as part of the development task, and UI changes include modifications to the Selenium Nightwatch scripts. Issues found by tests are posted to cloud-based logging systems, just like production errors and messages.
Once a pull request is opened, via ChatOps or a Visualforce page, changes are first verified through static analysis, and both Apex test coverage and Apex test throughput are confirmed. Selenium Nightwatch tests confirm that the user interface works as expected. Peer developers review and test all changes before code and sample data are merged back to version control.
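The pull-request gate can be sketched as a single check over those three signals. The thresholds below are illustrative defaults (75% echoes the Salesforce minimum coverage for package uploads), and the function name is hypothetical.

```python
# Hypothetical sketch of the pull-request verification gate: static
# analysis findings, Apex test coverage, and Apex test throughput (run
# time) are all checked before peer review. Thresholds are illustrative.

def verify_pull_request(static_findings, coverage_pct, test_runtime_s,
                        min_coverage=75.0, max_runtime_s=600):
    failures = []
    if static_findings:
        failures.append(f"{len(static_findings)} static-analysis finding(s)")
    if coverage_pct < min_coverage:
        failures.append(f"coverage {coverage_pct:.0f}% below {min_coverage:.0f}%")
    if test_runtime_s > max_runtime_s:
        failures.append(f"tests took {test_runtime_s}s (limit {max_runtime_s}s)")
    return (not failures, failures)

ok, reasons = verify_pull_request([], coverage_pct=82.0, test_runtime_s=240)
```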
The full test suite runs automatically against the merged changes, to be sure everything still works alongside other recent changes. Failed builds are flagged for handling by the developer responsible for the most recent change.
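Routing a failed merge build to the developer behind the most recent change can be sketched like this; the build and commit-log record shapes are hypothetical.

```python
# Hypothetical sketch of assigning a failed merge build to the developer
# responsible for the most recent change. The record shapes are
# illustrative, not any real CI system's API.

def assign_failure(build, commit_log):
    if build["status"] != "failed":
        return None  # passing builds need no owner
    last_commit = max(commit_log, key=lambda c: c["timestamp"])
    return last_commit["author"]

owner = assign_failure(
    {"status": "failed"},
    [{"author": "alice", "timestamp": 1},
     {"author": "bob", "timestamp": 2}],
)
```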
A build uploads a new version at the end of each sprint, verifies the version, and then submits the version for end-to-end testing.
Managed packages are kept in a state of continual release readiness. The latest work increment, including help topic updates, is examined at the close of each sprint to determine whether the version is ready for prime time.
The examination includes both regression and acceptance tests, in automated and exploratory form. Apex tests can make configuration changes during a test run to confirm features work as expected when enabled or disabled. Support can enable or disable the same optional features on the customer's behalf through a secure Apex web service. Exploratory testing includes reviewing updated help topics, and testing like a customer.
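Flipping an optional feature on and off within a test run, as the Apex tests do, can be sketched as below. The feature name, surcharge behavior, and `set_feature` helper are hypothetical stand-ins for the packaged configuration.

```python
# Hypothetical sketch of a test that toggles an optional feature during a
# run to confirm behavior in both states, mirroring how the Apex tests
# make configuration changes. All names are illustrative.

FEATURES = {"advanced_invoicing": False}

def set_feature(name, enabled):
    FEATURES[name] = enabled

def invoice_total(amount):
    # The optional feature adds a 5% surcharge when enabled.
    if FEATURES["advanced_invoicing"]:
        return round(amount * 1.05, 2)
    return amount

results = []
for enabled in (False, True):
    set_feature("advanced_invoicing", enabled)
    results.append(invoice_total(100.0))
```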
All error events, whether in test or production, are posted to cloud-based logging systems. The package also posts any feature configuration changes to our audit log, whether by an administrator or our web access point.
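An audit-log entry for a feature configuration change might carry the feature, the new state, who changed it, and through which channel. This record shape is an assumption for illustration, not the package's actual log schema.

```python
# Hypothetical sketch of the audit-log record posted whenever a feature's
# configuration changes, whether an administrator flipped it or the change
# came through the web service. The schema is illustrative.

def audit_entry(feature, enabled, actor, channel):
    assert channel in ("admin_ui", "web_service")
    return {
        "event": "feature_config_change",
        "feature": feature,
        "enabled": enabled,
        "actor": actor,
        "channel": channel,
    }

entry = audit_entry("advanced_invoicing", True,
                    "support@example.com", "web_service")
```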
Using a secure Salesforce app in your business org, any certified version can be pushed to customer staging or production orgs, individually or in groups, typically with the latest features disabled, and other needed updates automatically applied and logged. User access to new components is managed through packaged permission sets and a stub profile.
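Scheduling a certified version push to a group of orgs, with new features left disabled by default, can be sketched as follows. The org IDs and `plan_push` helper are hypothetical, not the actual Salesforce push-upgrade API.

```python
# Hypothetical sketch of planning a certified-version push to a group of
# customer orgs, latest features disabled by default. Names and the job
# shape are illustrative, not the real push-upgrade API.

def plan_push(version, org_ids, enable_new_features=False):
    return [{
        "org": org_id,
        "version": version,
        "new_features_enabled": enable_new_features,
    } for org_id in org_ids]

jobs = plan_push("2.4.0", ["00D000000000001", "00D000000000002"])
```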
Usage metrics and logs are continually scanned to detect statistically significant variations in error volume, creating a "production immune system". Subscriber Apex tests are run periodically to detect any new failures.
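The "production immune system" idea, comparing today's error volume against the recent baseline and flagging statistically significant spikes, can be sketched with a simple z-score check. The 3-sigma threshold is an illustrative choice, not the system's actual rule.

```python
import statistics

# Hypothetical sketch of the "production immune system": flag error
# volumes that deviate significantly from the recent baseline. The
# 3-sigma threshold is an illustrative default.

def error_spike(history, today, sigmas=3.0):
    mean = statistics.mean(history)
    stdev = statistics.pstdev(history)
    if stdev == 0:
        return today > mean  # flat baseline: any increase is notable
    return (today - mean) / stdev > sigmas

quiet = error_spike([10, 12, 11, 9, 13], 12)   # within normal variation
spike = error_spike([10, 12, 11, 9, 13], 40)   # well above baseline
```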
Each customer can enable new features when they are ready, or not at all. The package feature adoption dashboard is generated from analysis of the usage metrics and configuration audit log. Based on adoption feedback and other inputs, additional changes can be specified, closing the loop.
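Deriving adoption numbers from the configuration audit log can be sketched by taking the latest entry per customer and feature. The log entry shape is hypothetical, assumed for illustration.

```python
from collections import Counter

# Hypothetical sketch of building the feature adoption dashboard from the
# configuration audit log: the latest entry per (customer, feature) pair
# decides whether the feature counts as adopted. Schema is illustrative.

def adoption_counts(audit_log):
    latest = {}
    for entry in audit_log:  # entries assumed in chronological order
        latest[(entry["customer"], entry["feature"])] = entry["enabled"]
    counts = Counter()
    for (customer, feature), enabled in latest.items():
        if enabled:
            counts[feature] += 1
    return counts

counts = adoption_counts([
    {"customer": "acme", "feature": "advanced_invoicing", "enabled": True},
    {"customer": "globex", "feature": "advanced_invoicing", "enabled": True},
    {"customer": "acme", "feature": "advanced_invoicing", "enabled": False},
])
```

Only globex still has the feature on in its latest entry, so it counts once.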
To assist with implementing the package for new customers, a Jumpstart trial is maintained with the latest GA versions. Support is also provided for maintaining implementation source under version control, using a separate developer sandbox for each task, populated with a subset of production data.