Archive for the ‘tips and tricks’ Category

TechTip: Installing Airtel Uganda Huawei E3131 on MacOS Sierra

If you are like me, there comes a time when you need to whip out old tech tools to solve a problem. In my case it was Internet access in Hoima, which has MTN and Airtel as usable networks.

I have an old modem, which initially had a data SIM card that went bad (that is a story for another day). However, the modem was discontinued in 2012, so there were no drivers for Mac: the installation package terminates with errors, so I was stumped.

Step 1 was finding the model, which involved opening up the modem as shown below:

[Photo: the opened modem, showing the model]

Step 2 was finding an installation package, which after about an hour of Googling and reading turned out to be Huawei Mobile Partner (https://www.dropbox.com/s/v33lsoe7qok0zsl/MobilePartner-MAC-V200R003B015D16SP00C983.zip?dl=0)

Once it is installed, you can use your modem readily. Hope this saves someone else some pain…

Software Delivery Project Setup and Engineering Checklist

I have been part of a number of software delivery projects over the years (note the emphasis on delivery, not development), and I thought I would share my checklist for projects, as well as myths that need to be dispelled for teams to succeed.

An interesting quote from a delivery manager I have worked with and admire: “We need to remove the notion that software products are successful because of hero developers; it is teams that consistently produce quality software”.

With teams in mind, and given that not everything can be put in place from day 1 of a project, here is what I think.

Must Have

These practices/processes must be in place on day 1; otherwise you run the risk of paying for them later.

  1. Version control process – I have seen many a project with a version control system, but without a process for managing how the developers in the team commit code for features. I personally recommend:
    • Trunk-based development: master always has the latest version of working code
    • Version branches: when there are major version changes, maintain a separate line for bug fixes. However, this leads to overhead for back-porting (from master to the version branch) or forward-porting (from the version branch to master), so it must be used very carefully
    • Pull requests for code review and feature tracking – each developer must have their own version of the code, and issue pull requests for code review and merging
    • Each feature is worked on in its own branch, so development does not slow down during code review cycles
    • Regular developer commits
    • Pull request guides: I like this one from the OpenMRS open source project, where I contribute: https://wiki.openmrs.org/display/docs/Pull+Request+Tips
  2. Unit Testing Framework – pragmatic usage of unit tests for business rules and the multiple paths through the code
  3. Automated building of deployment packages – manual builds are error-prone and not repeatable
  4. Automated configuration switching between environments
    • external configuration of databases, web service calls, etc.
    • separation of development and staging environment configurations
  5. CI pipeline – shows the status of builds on code commits to the repository; requires unit testing to be in place
  6. Ticketing and Task Tracking – what features are to be built and when, and what is their relationship? This also helps track work across sprints and communicate with stakeholders
  7. Security – the Open Web Application Security Project (OWASP) Top 10 is a minimum standard to be followed
  8. Architecture decisions:
    • Configuration over customization
    • Pragmatic use of external libraries that solve some part of the problem space
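The unit-testing point in the list above (item 2) can be sketched with any xUnit-style framework. Here is a minimal, hypothetical Python example; the discount rule and its threshold are made up purely for illustration, but it shows the idea of covering a business rule's multiple paths, including the boundary:

```python
import unittest

def discount(order_total):
    """Hypothetical business rule: 10% off orders of 100 or more, else none."""
    return order_total * 0.10 if order_total >= 100 else 0.0

class DiscountRuleTest(unittest.TestCase):
    # Cover both paths through the rule, plus the boundary value.
    def test_no_discount_below_threshold(self):
        self.assertEqual(discount(99), 0.0)

    def test_discount_at_threshold(self):
        self.assertEqual(discount(100), 10.0)

    def test_discount_above_threshold(self):
        self.assertEqual(discount(250), 25.0)

if __name__ == "__main__":
    unittest.main(exit=False)
```

The same shape applies in any language: one small test per path through the rule, run automatically on every commit by the CI pipeline.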

Important

These may not be in place at the project start, but they must remain front of mind and be put in place when the opportunity arises.

  1. Coding styles – at the project level, or even at different layers
  2. Documentation – usually an afterthought, leading to gaps later due to additional pressures. Once project stability is reached, it becomes important for different stakeholders. I love Markdown and the excellent GitBook (http://gitbook.com) editor and toolchain
  3. Integration Testing framework – includes UI testing of flows; however, such tests are usually brittle, so they have to be written pragmatically for critical and complex paths
  4. Automated deployment of builds to a staging server – this is a great step, as it does not break the developer flow for showcases and demonstrations to stakeholders
  5. Integration, load and security testing – leave these out at your peril, as they will come back to bite you later. Set some assumptions and test them to your heart's content in an automated manner
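The automated configuration switching called out in the must-have list can be as simple as one environment variable selecting the whole configuration. A minimal sketch, assuming Python and hypothetical keys (in practice the per-environment values would live in external files, not in code):

```python
import json
import os

# Hypothetical per-environment settings; in a real project these would be
# externalized, e.g. config/development.json and config/staging.json.
CONFIGS = {
    "development": {"db_url": "jdbc:mysql://localhost/app_dev", "debug": True},
    "staging": {"db_url": "jdbc:mysql://staging-db/app", "debug": False},
}

def load_config():
    # A single environment variable switches the entire configuration,
    # so no code or build changes are needed between environments.
    env = os.environ.get("APP_ENV", "development")
    return CONFIGS[env]

print(json.dumps(load_config()))
```

Deploy scripts then only set `APP_ENV`, which keeps the same build artifact usable in every environment.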

Myths to be quashed in teams

  1. Developers do not write documentation – it is the responsibility of every member of the team to contribute to documentation writing and review
  2. Back end, database and front end developers – large projects may provide the flexibility to isolate developers; however, it is important for developers to work across the “application layers” to reduce rework and enable evolution as knowledge of the product increases
  3. Testing is a waste of time – a stitch in time saves nine. Pragmatic testing saves time, since it provides more confidence in the code, reducing stress before showcases and production deployments
  4. Developers should leave testing to QA staff – testing is multi-layered, so developers should play their part to support testing and quality assurance efforts. QA staff have a different mindset, which helps poke holes and find gaps in developed software
  5. All developers must use the same IDE – the best tool for a developer is the one they know how to use. If the workflow is IDE-specific, then the project setup and configuration need to be revisited to remove this dependency, which will otherwise constrain the team later
  6. I can build my code faster and better than a framework out there – advice from my mentor: “Each problem you are solving is a special case of a more general problem”, “There is no new problem under the sun”. Building new code to solve a special case may be faster today, but you will pay for it in maintenance and evolution

Looking forward to hearing your thoughts and what I may have missed.

TechTip: DBunit Export from JetBrains DataGrip

I am an avid test-driven development (TDD) advocate nowadays, with a pragmatic slant of course, looking to bulletproof the features I deliver to ensure that they do what is expected, and to work out edge cases.

A big challenge in testing is generating test data, which is needed to set up some integration test workflows. I have been using Jailer (http://jailer.sourceforge.net/) to generate data from existing tables in a DBunit format, which I can then embed in my test dataset XML files.

This is a challenge due to Jailer's mapping of relationships (a neat feature, by the way). So while working in DataGrip, the database IDE of choice, we were stuck on how to export different formats when looking at a table. Such a solution would allow us to leverage the available filtering and searching features to nail down the datasets that need to be exported.

On contacting the support team through Twitter (https://twitter.com/0xdbe/status/853900122828222465/photo/1), the recommendation was to modify the existing XML groovy script to generate DBunit XML, following the steps at https://www.jetbrains.com/help/datagrip/2017.1/extending-the-datagrip-functionality.html

Well, an hour later, a Groovy script to do just that can be found at https://gist.github.com/ssmusoke/ca4c55b4e52de97acb99a590644a677f

The code was not rendering well here, hence the move to a Gist.
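For context, the output is DBunit's flat XML dataset format, where each element names a table and its attributes are column values. A small illustrative fragment (the table and column names here are hypothetical):

```xml
<dataset>
  <patient patient_id="1" given_name="Jane" family_name="Doe"/>
  <patient patient_id="2" given_name="John" family_name="Smith"/>
</dataset>
```

Fragments like this can be embedded directly in the test dataset XML files mentioned above.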

Building and Maintaining Technical Documentation – Markdown with Gitbook Tooling

Documentation, the word that brings cold sweats to techies far and wide and makes product managers roll their eyes, is the one essential ingredient in aiding the adoption and usage of software tools and services.

A key mantra for software development and delivery is “documentation, documentation, documentation”, while agile purists take “working software over comprehensive documentation” to mean no documentation at all.

Obviously, in today's world, clients expect software to be key and to evolve quickly to meet changing needs, measured in weeks and not months. This fast-paced change actually highlights the importance of documentation, but it also places pressure on documentation to evolve more rapidly and be easier to use and understand, while maintaining a trail of the changes being made within a rapidly changing environment.

Many formats have come and gone over time: plain text, HTML Help, Windows Compiled HTML Help (CHM), Oracle and Sun Help, Eclipse Help and Flash help, not forgetting PDF and MS Word documents for printed manuals. The common practice was to use a single tool to develop the help content and compile it into multiple help formats.

Fast forward, and the model seems to remain the same; the challenge now is which markup language to use for generation. In comes Markdown (https://en.wikipedia.org/wiki/Markdown), which, like JSON and YAML, aims for readability, so that users do not need to know a markup language but still get a way of formatting with simple conventions. Interestingly, at the time of writing, even WhatsApp, one of the most popular chat clients, uses Markdown-like formatting.

The formatting challenge has been solved, but now: how to build the content, version control it to keep track of updates, generate the output, and share it with the world? The most common tools are:

  1. GitHub pages (https://pages.github.com/) – using a special branch within a GitHub based repository to build online based documentation
  2. GitBook (https://www.gitbook.com/) which provides an excellent editor, hosting, build for generating PDF, online documentation, and mobile format (.epub and .mobi). Just open an account, fire up the editor and you are good to go

However, if you are using private repositories and need to keep content internal-facing, you have to pay quite a bit for GitBook or jump through multiple hoops. Fortunately, GitBook provides a command-line client, which can be used in this case to build the documentation, which is then distributed through internal channels.

The steps to setup a local Gitbook environment are:

  1. Install npm
  2. Install Gitbook cli by typing the command below
    $ npm install gitbook-cli -g
  3. Setup a git repository for the project and add the following files:
    • .gitignore – include the _book directory in which the book will be generated
    • book.json sample below
      {
        "title": "My Book",
        "description": "Description",
        "author": "Author Name",
        "gitbook": ">= 3.0.0",
        "structure": {
          "summary": "SUMMARY.md",
          "readme":"README.md"
        }
      }
    • README.md – the default information about the book
    • SUMMARY.md – the table of contents for the book; content files may live in different directories (a docs directory is recommended)
      # Summary
      
      * [Introduction](README.md)
      * [Chapter 1](docs/chapter1.md)
      * [Chapter 2](docs/chapter2.md)
    • package.json – contains build commands for the book
      {
        "scripts": {
          "docs:prepare": "gitbook install",
          "docs:watch": "npm run docs:prepare && gitbook serve"
        }
      }
    • The final project structure looks like this:

      [Screenshot: example GitBook project structure]

  4. You can view the book locally by running the command below, which starts up a server on port 4000:

    $ npm run docs:watch

  5. Commit the contents to your git repo

I have to say that I love the GitBook Editor, which works way better than my IDE, so after committing the initial files I fire it up and open the directory containing my project so that I can edit the files from there. Obviously I lose the ability to put high-quality comments on what has changed in the files without jumping back into my IDE or the git command line, but the sacrifice is currently worth it.

Additional Steps to Generate ePub and PDF

  1. Install Calibre (https://calibre-ebook.com/download) which provides the ebook-convert utility
  2. Add tasks to the package.json as below:
    {
      "scripts": {
        "docs:prepare": "gitbook install",
        "docs:watch": "npm run docs:prepare && gitbook serve",
        "docs:generate-epub" : "gitbook epub ./ ./_book/mybook.epub",
        "docs:generate-pdf" : "gitbook pdf ./ ./_book/mybook.pdf",
        "docs:generate" : "gitbook build && npm run docs:generate-epub && npm run docs:generate-pdf"
      }
    }
  3. Generate epub by running the command

    npm run docs:generate-epub

  4. Generate pdf by running the command

    npm run docs:generate-pdf

  5. Generate both epub and pdf by running

    npm run docs:generate

UPDATE: More information on the GitBook toolchain can be found at https://www.gitbook.com/book/gitbookio/docs-toolchain

UPDATE2: Added steps to generate epub and PDF documents

UPDATE3: Discovered that the process has a name – Documentation Driven Development – which is a pretty interesting concept … https://twitter.com/brnnbrn/status/847197686042312704/photo/1


11 Tools in my 2017 Bag

One month is down, so I just wanted to share the tools that I am using in 2017 to get work done. I am writing a lot of software and documentation this year.

  1. IntelliJ Idea Ultimate and Datagrip – I have licensed versions of the whole suite of IntelliJ IDEs, but I mainly use Idea for Java, HTML/CSS and Markdown editing. I use DataGrip as my primary GUI for data management.
  2. Simple Note – a text-based replacement for Evernote (which now only works on two devices). It allows me to share notes across my devices, as well as with others, either via a web URL or email-based tagging
  3. Expensify – oh yes, I need to keep track of those expenses for reimbursement without having to worry about finding the original hard copies, which I keep in envelopes tagged by month
  4. Google Calendar – oh yes, too many meetings, conference calls across multiple time zones, and a task list using the Reminder feature. Did I mention managing all this across multiple work and personal email addresses?
  5. LinkedIn/Medium – my source of professional news and information that I need to keep track of professionally
  6. Dropbox – never to lose a file or a photo again. In addition I use Box.com with secure encryption for work, and Google Drive when my collaborators do not have any other option.
  7. iTerm 2 – a terminal emulator with multiple windows and lots of configuration options, to replace the default macOS Terminal
  8. Docker – I am looking forward to increasing my use of this container tech in the year to simplify environment setup and deployment
  9. Homebrew & Caskroom – the missing package managers for macOS, which allow the installation of GUI and non-GUI software without having to wrangle with paths and configurations
  10. Trello – task manager extraordinaire; loving the Power-Ups that allow integration with Slack, GitHub and other online tools …
  11. RunKeeper – need to keep track of my running activity with a target of 500km in the year

Bonus: Using a wireless keyboard and an Apple Trackpad 2 seems to be meeting the ergonomic needs I have in fighting off carpal tunnel in my wrists.

Tech Tip: Reducing pain while moving from Yahoo to Gmail

It's official that Yahoo has been hacked (http://www.nytimes.com/2016/12/14/technology/yahoo-hack.html?_r=0), and it is time to make the change from Yahoo to another email provider. For a free service, it looks like Gmail is the best there is at this time.

My quick guide to reducing the migration pain is as follows:

Step 1: Start adding your Gmail address to all correspondence and signatures, and start giving it out instead of your Yahoo address

Step 2: Set up your Gmail to start receiving email from your Yahoo address – see How to Access Yahoo! Mail in Gmail

Step 3: Respond to all your correspondence via Gmail

While your cut-over is immediate, it will take some time, probably 3 to 6 months, for your correspondents to finally start using the new address, so be patient.

Alternate Approach to Legal Independent Election Tallying

The Uganda elections are more or less over, with less than 6 hours left for the Uganda Electoral Commission (EC) to announce the results of the presidential elections.

Given all the time on our hands, with no social media, the team at Styx Technology Group designed the following alternative approach to independent electoral vote tallying for future elections, one that provides inbuilt mechanisms for the audit and verification of results.

The primary data sources for the process are:

  1. Official EC list of polling stations and voters per polling station
  2. Photos of the signed election tally sheets from each polling station. To ensure that the photos are not tampered with, and to provide an audit trail:
    • Each photograph has to carry information on the camera, the GPS coordinates of where it was taken, and the date and time when it was taken; this metadata is available in many cameras, which share it using the Exchangeable Image File Format (EXIF)
    • Two separate photos of the tally sheets have to be taken by different cameras
    • The camera equipment may be registered beforehand to validate the source of the information
    • The signatures of the returning officers and the stamp must be clear and visible in the photo

The architecture for the technology solution is as follows:

  1. A web-based solution accessible via any browser. Due to poor Internet connectivity in many areas of the country, an Android app would be provided to assist in data collection, with the data sent once the user gets into an area with Internet access.
  2. The field officers who capture the photos would also be given the option of entering the candidate vote tallies.
  3. In the tallying center, data clerks enter the candidate vote tallies from the photos received. In order to reduce errors, the following approach would be used:
    • The clerks are randomly assigned photos as they come in
    • The tally for a station must be entered identically by two separate data entry clerks, then approved by a supervisor. This process is formally called the two-pass verification method, or double data entry.
  4. All correctly entered data is shared with the rest of the world for download and analysis.
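The double data entry check described above reduces to a simple comparison rule. A minimal sketch in Python, with hypothetical candidate names and tallies, just to make the acceptance logic concrete:

```python
def verify_double_entry(entry_a, entry_b):
    """Two-pass verification: two clerks enter the same tally sheet
    independently, and a record is accepted only when both entries match.
    Returns the verified tally, or None to flag it for a supervisor."""
    if entry_a == entry_b:
        return entry_a
    return None  # mismatch: escalate to a supervisor for resolution

# Hypothetical tallies for one polling station, keyed by candidate.
clerk_one = {"candidate_a": 120, "candidate_b": 85}
clerk_two = {"candidate_a": 120, "candidate_b": 85}
print(verify_double_entry(clerk_one, clerk_two))
```

A supervisor would only ever look at the flagged mismatches, which keeps the human review workload proportional to the error rate rather than the total volume of photos.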

This system is mission-critical, having to be available for the entire vote counting period of 48 hours, so the architecture includes the following paths for data collection:

  1. Multiple IP addresses and domains for accessing the website, in case some are blocked off
  2. Any data collected via the Android app can be sent via email to a dedicated tallying center address. To ensure that only data from the app is received, and that it has not been changed in transit, encryption is used.
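One simple way to guarantee that emailed data came from a registered app and was not changed in transit is a keyed signature (HMAC) over the payload; this is a sketch of that idea, not the described design's exact mechanism, and it assumes a shared secret provisioned per device at registration time:

```python
import hashlib
import hmac
import json

# Hypothetical shared secret provisioned into the app when the device
# is registered with the tallying center.
SECRET_KEY = b"per-device-secret"

def sign_payload(payload):
    """Serialize the tally payload and attach an HMAC-SHA256 tag so the
    tallying center can detect any change made in transit."""
    body = json.dumps(payload, sort_keys=True).encode()
    tag = hmac.new(SECRET_KEY, body, hashlib.sha256).hexdigest()
    return {"body": body.decode(), "tag": tag}

def verify_payload(message):
    """Recompute the tag at the tallying center and compare in constant time."""
    expected = hmac.new(SECRET_KEY, message["body"].encode(), hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, message["tag"])

msg = sign_payload({"station": "HOIMA-014", "candidate_a": 120})
print(verify_payload(msg))  # a tampered body would fail verification
```

HMAC only proves integrity and origin; if the tallies themselves must also be kept confidential in transit, the signed body would additionally be encrypted before emailing.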

The inspiration came from a quote by Gandhi, “Be the change you wish to see in the world”, disproving the myth that there is no local capability to design and implement such solutions and, most of all, that such solutions have to be complex.

Looking forward to hearing your thoughts and suggestions…
