August 2012 - July 2019 (6-month contract extended to 7 years)
iaAnyware had built a WinForms .NET Framework 3.5 desktop application providing line-of-business functionality for brokers to manage client, quote, policy, and claims information.
I introduced web development and streamlined development processes: full-lifecycle development from conception through delivery, plus training in web development concepts, architecture, infrastructure, and CI/CD. Effectively DevOps and software architecture, alongside backend and frontend development.
From 2012 to about 2015 I created the web libraries and architecture for their web-based version: I created all the libraries and Git repositories, set up builds and deployments with MSBuild scripts and batch files, and trained developers in basic HTML, CSS, and JS. From there other developers became involved in web development, and then, as I migrated the ASP.NET Framework code to .NET Core, web development became the main focus.
The initial application was a giant monolithic desktop WinForms application, with other desktop apps for administration and other features, and SOAP web services to serve data. These applications were targeted at insurance brokers, not end clients.
The new web architecture consisted of splitting components down into respective microservices and creating Web APIs suited to each part — initially on ASP.NET Framework, later migrated to ASP.NET Core.
Along with this, iaAnyware desired an online quoting application for new customers, and a client portal specifically for clients of the brokers to self-serve.
I prototyped the first iteration of the iAdviser web application in ASP.NET MVC around 2013 to 2015. This was full-stack development, backend and frontend. I approached it with domain-driven design, using best practices at the time.
Implemented repository-service patterns following SOLID principles where possible, with LINQ over Entity Framework and entity-to-viewmodel mapping.
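A minimal sketch of that repository-service layering, in Python for brevity (the real code was C#; names like Client and ClientService are illustrative, not from the actual codebase):

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class Client:
    id: int
    name: str


@dataclass
class ClientViewModel:
    display_name: str


class ClientRepository:
    """Data-access layer: hides the EF/LINQ queries behind a simple interface.
    Here backed by a dict; the real one wrapped an Entity Framework context."""

    def __init__(self):
        self._store = {}

    def add(self, client: Client) -> None:
        self._store[client.id] = client

    def get(self, client_id: int) -> Optional[Client]:
        return self._store.get(client_id)


class ClientService:
    """Service layer: depends on the repository abstraction (dependency
    inversion) and maps entities to clean viewmodels for the frontend."""

    def __init__(self, repo: ClientRepository):
        self._repo = repo

    def get_view(self, client_id: int) -> Optional[ClientViewModel]:
        client = self._repo.get(client_id)
        if client is None:
            return None
        return ClientViewModel(display_name=client.name.title())
```

The point of the layering is that controllers only ever see the service and its viewmodels, never the data-access details.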
For shared libraries I initially set up a hosted NuGet server on ia's VM in the datacentre, and later migrated the NuGet projects to MyGet, along with scripting processes to simplify NuGet package creation.
One of the big challenges with implementing Entity Framework was subtle differences (extra fields and tables) between the core databases for Australia and New Zealand, in a SQL Server database with ~800 tables. With EF this initially meant maintaining two different contexts to work across both, and using some funky magic to apply the right context per region. This was a big time-sink.
So I recommended consolidating the Australian and New Zealand databases to the same structure, which proved to be a big winner. The development manager did the actual database consolidation, and I updated all the web-based projects.
Over a period of approximately 9 months, with implementation by a colleague, we re-architected how their report generation was done. It had previously used Crystal Reports in the monolithic desktop app, with a lot of SQL stored procedures to generate reports, all done within the WinForms app.
The new approach I recommended was to generate these complex reports entirely server-side: HTML, CSV, PDF, everything. The criteria for report generation would be sent to the server as JSON; the API would return a generation "guid", and the request would be processed by another service (an Azure WebJob). The frontends could then poll the APIs for the current generation status and details.
I advised on the architectural design, did code reviews and refactoring, and assisted as needed throughout this process.
It took several iterations to cover all scenarios.
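The submit/poll shape of that report pipeline can be sketched as below (Python for brevity; the real system was C# with an Azure WebJob as the worker, and class/field names here are illustrative):

```python
import uuid


class ReportJobService:
    """Sketch of the submit/poll pattern: the API accepts JSON criteria,
    hands back a generation GUID, and a background worker (an Azure
    WebJob in the real system) picks the job up and updates its status."""

    def __init__(self):
        self._jobs = {}

    def submit(self, criteria: dict) -> str:
        """API endpoint: register the request and return its GUID."""
        job_id = str(uuid.uuid4())
        self._jobs[job_id] = {"status": "queued", "criteria": criteria, "result": None}
        return job_id

    def process_next(self) -> None:
        """Stand-in for one pass of the background worker loop."""
        for job in self._jobs.values():
            if job["status"] == "queued":
                # real worker: run queries, render HTML/CSV/PDF output
                job["result"] = "report." + job["criteria"].get("format", "pdf")
                job["status"] = "complete"
                return

    def status(self, job_id: str) -> str:
        """Polling endpoint for the frontends."""
        return self._jobs[job_id]["status"]
```

Decoupling submission from generation this way keeps the API responsive while long-running reports churn in the background.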
Now, this was a project built before the introduction of .NET Core, using EF 6.0 on .NET Framework, and the biggest issue with migrating it to .NET Core at that stage was updating the hundreds of EF LINQ queries to work properly with it. Instead, I wrote wrappers around .NET Core's configuration so at least it could share the same configuration source as all the new .NET Core projects.
With the new web layers there were multiple APIs serving different requests, so I created an authentication microservice API to generate JSON Web Tokens that all the APIs would consume.
These tokens contained the build environment they were issued for; per request, that environment drove the dependency injection setup and connection configuration:
For example, with "local", "dev", or "stage" stored in the token, the matching connection factories would be applied when a request was made, so the DI contexts and libraries injected into constructors suited the correct environment. Simpler than it sounds.
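The environment-selection idea reduces to something like this (Python for brevity; the connection strings and claim name are hypothetical, and the real version wired this into the DI container rather than a lookup function):

```python
# Hypothetical connection strings keyed by build environment.
CONNECTIONS = {
    "local": "Server=localhost;Database=ia_local",
    "dev": "Server=dev-sql;Database=ia_dev",
    "stage": "Server=stage-sql;Database=ia_stage",
}


def connection_for(token_claims: dict) -> str:
    """Pick the connection string based on the environment claim carried
    in the JWT, mirroring how the per-request connection factories
    selected the right database for the injected DI contexts."""
    env = token_claims.get("env")
    if env not in CONNECTIONS:
        raise ValueError(f"unknown build environment: {env!r}")
    return CONNECTIONS[env]
```

In the real system this ran once per request, before the contexts were constructed, so everything downstream was already pointed at the right environment.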
I initially implemented the Azure storage libraries for Azure queues, blobs, and file shares, then created wrappers to abstract the blob file stores and other Azure services.
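The wrapper idea was to code against a storage abstraction rather than the Azure SDK directly. A minimal sketch (Python for brevity; interface and class names are illustrative):

```python
from abc import ABC, abstractmethod


class FileStore(ABC):
    """Abstraction over blob/file storage, so application code depends on
    this interface rather than on the Azure SDK types directly."""

    @abstractmethod
    def put(self, key: str, data: bytes) -> None: ...

    @abstractmethod
    def get(self, key: str) -> bytes: ...


class InMemoryFileStore(FileStore):
    """Test double; the production implementation wrapped Azure Blob
    Storage, and swapping it in was just a DI registration change."""

    def __init__(self):
        self._blobs = {}

    def put(self, key: str, data: bytes) -> None:
        self._blobs[key] = data

    def get(self, key: str) -> bytes:
        return self._blobs[key]
```

Besides testability, the abstraction meant an SDK version bump touched one wrapper instead of every consumer.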
Created .NET Core configuration source providers that use Azure File Shares as a multi-environment config loader. This meant the JSON config files for all environments (local, dev, stage, live, and more) could be stored in one central location on Azure Files, instead of within each API or console application. CI/CD was also set up for the config repositories to update the Azure file shares when config files changed; then a simple restart refreshed each app's configuration.
Yeah: batch files, PowerShell scripts, and MSBuild builds and deployments, later migrated to VSTS (Azure DevOps).
Here I created the company's VSTS (now Azure DevOps) account, and set up the backend and frontend builds and releases.
Tested all reports via sample JSON queries, with polling for UI updates as generation occurred on the server.
This also included backend model projections to and from the database into a clean, structured set of viewmodels for the frontend, plus APIs to handle querying and updating of responses.
The legacy desktop application had no unit testing, so on the new web apps I implemented unit testing with xUnit, using TDD where possible.
Also created a large variety of LINQPad scripts for scratchpad experiments, and even some simple tooling.