Michael Lewis

Chief Technology Officer, Full Stack Developer, System Architect, Programmer, Front-end Designer, Database Maven


If you're looking for a copy-and-paste monkey programmer, I'm not your guy. However ...

If you're looking for someone who has pretty much done it all and produces quality solutions insanely fast, then keep reading.

I started my career designing and writing operating systems, programming languages and database systems. As new technologies appeared on the scene I learned and applied the best of them to my work. I am fluent in many programming languages. I am a workaholic (typically putting in 90-105 hours per week these days).

I built this web site using Bootstrap as the base, with PHP and JavaScript for the dynamic aspects.

Basically I love what I do and according to the old adage I haven't worked a day in my life.

What I do


I can design simple or complex systems. Everything from full SaaS platforms to eCommerce.

It's a Mobile World

Every web site I've done for the last 9 years has used liquid design concepts (what we now call responsive). I made the decision after having an iPhone for six months and seeing that mobile was the wave I wanted to be on top of.

Play Well with Others

I work well in a team environment. Over the last 12 years I have usually been leading the team but I can mesh my skills with others in a very collaborative way. I have almost always learned something from everyone I have worked with and hope they've learned something from me.


I can code in many different languages and am proficient in many different techniques, such as multi-threaded parallel programming.

Full Stack Developer

In other words, I can take a concept and produce a working solution for all aspects of the project. I have knowledge and proficiency in all aspects of project implementation and management and use that knowledge to quickly create solutions.


I love to learn and I love new things. The Raspberry Pi came out and I bought one immediately, then spent the weekend figuring out how to get it to unlock my front door when I showed up with my iPhone in my pocket. I've created a number of useful little projects using the Pi, such as a passive DNS sensor.


I am the CTO of iThreat Cyber Group, Inc (ICG), a working CTO that is. ICG is, at its heart, an internet investigation firm (see our web site for more information). I have been with ICG since 2003. While there I have undertaken many complex projects as well as overseeing the development team. Before ICG I worked full time as a hired gun for over 10 years. Since our most expensive assets are our analysts, my mantra is to figure out how to make them more effective: how can I swap machine cycles for human effort? Doing so improves our bottom line and helps us arrive at answers faster and in more depth than the next guy.

I have a ton of experience doing just about everything. Listed below are only the highlights of projects, which cover pretty much the entire range of what I do. I go into some detail where I can; in many cases, however, NDAs limit what I can say.

  • Work Experience and Significant Projects

  • cybertoolbelt.com 2.x


    Redesigned and reimagined cybertoolbelt.com and the client-facing RESTful API. The redesign of the GUI had two primary goals: make it nicer looking and improve workflow. Out of this came additional capabilities and improved performance. The API was rewritten for performance and room to expand.

    The rewrite of the front end allowed me to develop a method of truly multitasking JavaScript without web workers or other hacks.

    Some More Detail

    CyberTOOLBELT (CTB) is a Single Page Application (SPA). Bootstrap is used for the responsive layout/presentation control. I designed and implemented an InterProcess Communication (IPC) system for the remote servers, local servers and processes to communicate with each other and the user.

    The IPC is at the heart of the multitasking capabilities. For example, a domain lookup from the platform actually results in 8 calls to the backend. The JavaScript front end makes all 8 calls without waiting on any individual one. The calls are processed concurrently on the backend, and the front end display is updated as each call finishes. Calls can take anywhere from tens of milliseconds to over a minute to complete, depending on the quantity of data to be returned. Yet the user on the front end can switch to another tool and proceed without waiting for prior tasks to finish.
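    The fire-and-forget pattern described above can be sketched in plain Promises. This is a minimal illustration, not CTB's actual IPC layer; fakeCall(), the call names and the delays are invented stand-ins for real backend requests.

    ```javascript
    // fakeCall() stands in for one backend request; the delays simulate
    // calls that finish at different times.
    function fakeCall(name, ms) {
        return new Promise(function (resolve) {
            setTimeout(function () { resolve(name + ' done'); }, ms);
        });
    }

    var finished = [];
    var calls = [
        fakeCall('whois', 30),       // slowest call
        fakeCall('dns', 10),         // fastest call
        fakeCall('passive-dns', 20)
    ];

    // Each "panel" updates the moment its own call resolves; no call
    // waits on any other.
    calls.forEach(function (p) {
        p.then(function (result) { finished.push(result); });
    });

    // Promise.all only observes completion; it does not serialize the calls.
    var allDone = Promise.all(calls).then(function () { return finished; });
    ```

    Note that the results land in completion order, not issue order, which is exactly why the display can update piecemeal.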

  • CyberINTEL.report


    I designed and implemented much of this intelligence platform. It leverages the CyberTOOLBELT backend to produce comprehensive reports about domains, IP addresses, email addresses and Whois datamining. We call it our "Analyst-in-a-box" product. The thinking is that we would distill how our internet analysts investigate these items and make it possible for a person without that expertise to have an investigation done without spending hundreds of dollars on a time-and-materials engagement.

    Some More Detail

    The system produces reports primarily as PDF files. The other report format is Excel-compatible spreadsheets (not .csv files). The reports range in price from $4.95 to $29.95. This pricing is designed to compete directly with Time and Material shops (such as our own). The minimum internet investigation of a domain by a T&M shop varies from $100 to $350 (for an hour's worth of work).

    The reports generated are not simply a dump of data from our extensive database. When possible the software attempts to point to interesting things in the data.

  • findstuff4.us


    Ignoring the stupid name (stupid because it has nothing to do with what the site does; it was a domain I had lying around, and we'll get a better one later), this is kind of a cool little tool to pretty-format MySQL .FRM files. It's something I dabble in every so often as I need new features or improve existing ones. It's one of those neat little tools you write for yourself that may be useful to other developers too. Please send me feedback if you run into any problems or have suggestions.

  • ithreatfusion.center


    iThreat fusion center (iFC) came about because one of our larger clients (a Fortune 100 company) needed to manage the insane amount of data arriving at their company from an incredible number of sources. This data arrives by email, RSS feed, API, Microsoft Word docs, PDFs, human analysts, and intelligence/alerting services such as NC4, Stratfor, and others.

    This project involves converting PDF and .DOCX files for indexing, as well as parsing emails to separate out the information needed for searching and for intelligence summaries.

    Beyond a heatmap of activity, we have another map which displays their travelers (they average 12,000+ people a day on the road) as well as their 5,000+ facilities. So if an incident occurs, they can quickly determine whether they have people in the area (including on planes and trains) and contact them using the system.

    Way More Detail

    One of the more interesting aspects of the project is that two ways to search the library had to be implemented. One I call the "old guy" way, for users who are more comfortable searching the library as they would a regular library: first find the source, then peruse the titles until you find what you are looking for.

    The other way to search is with a search engine (in this case, Elastic Search). I didn't want separate standard and advanced searches, so I allow free-form text input to the search front end, which my parser then compiles down into something that makes sense to Elastic Search. Consider the somewhat absurd query:

    "this is a phrase" and date on 2/3/2014 date on or before 2015-03-29 and (date>=3/18/2016 body contains lewis not title contains michael) cybertoolbelt.com country only 'united states' and source is pinkerton

    The first step in the compilation process is validation, producing a tokenized string for the final step. In this case the tokenized string is:

    "this is a phrase" AND date = 2014-02-03 OR date <= 2015-03-29 AND (date >= 2016-03-18 OR body contains lewis NOT title contains michael) OR cybertoolbelt.com OR country only united states AND source = pinkerton

    Note that the input could be as simple as michael. The parser supports AND, OR, NOT and NEAR as operators for terms, plus phrase searching and "only" searching (an operator name I made up, so it may be unfamiliar). Documents in the search database can be tagged with multiple countries or other keywords; "only" specifies that the given tag (in this case the country name) must be the only tag associated with a document for the document to be selected.
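    As a toy illustration (not the production code), the "only" semantics reduce to a single-tag test, assuming each document carries an array of tags:

    ```javascript
    // Hypothetical helper: a document matches `country only X`
    // when X is its one and only country tag.
    function matchesOnly(docCountryTags, tag) {
        return docCountryTags.length === 1 && docCountryTags[0] === tag;
    }
    ```

    So a document tagged both "united states" and "yemen" would not match `country only 'united states'`, while a document tagged only "united states" would.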

    There is also extensive date handling. For example, you could enter michael since monday, yemen since last month, or graduation on or before 4/1/2013 and graduation<=3 weeks ago. Just about any way you can specify a date is handled.
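    The production parser isn't shown here, but a toy sketch of the date normalization it performs might look like the following; the operator list and the single M/D/YYYY format are simplified assumptions for illustration.

    ```javascript
    // Rewrite English date operators as comparisons and M/D/YYYY dates
    // as ISO YYYY-MM-DD, mirroring the tokenized output shown earlier.
    function normalizeDates(q) {
        var pad = function (n) { return ('0' + n).slice(-2); };
        return q
            .replace(/\bon or before\b/gi, '<=')   // longest phrases first,
            .replace(/\bon or after\b/gi, '>=')    // so "on" alone doesn't
            .replace(/\bon\b/gi, '=')              // clobber them
            .replace(/\b(\d{1,2})\/(\d{1,2})\/(\d{4})\b/g, function (_, m, d, y) {
                return y + '-' + pad(m) + '-' + pad(d);
            });
    }
    ```

    The order of the replacements matters: the multi-word operators must be rewritten before the bare "on", or the longer phrases would be mangled.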

  • cybertoolbelt.com


    I came up with the idea of creating an investigative platform for internet investigators. I noticed that the analysts at ICG were using a number of different tools and copying and pasting the results into a spreadsheet. There had to be a better way, and I think CyberTOOLBELT is that way.

    Way More Detail

    CyberTOOLBELT (CTB) is comprised of a number of subsystems and deals with big data. The system has many billions of records related to domains, IP addresses and other items.

    The design philosophy I took was to make the system easy to use for users who are not necessarily strong in their computer skills, yet not get in the way of skilled users.

    The tool design philosophy is basically: make each tool the best version of that tool it can be. An example would be the Traceroute tool. You can run traceroute from the command line, and there are about a zillion traceroute tools on the web. What I did was design and implement a more useful tool. The CTB traceroute tool allows you to examine an IP address or a domain from up to eight different starting points around the world at the same time. Anyone who has used traceroute can immediately see the advantages of this approach.

    Users see the web site GUI and interact with the data using that interface. There is also a REST API that allows bulk users to access certain exposed aspects of our data from their own servers.

    Underneath it all is an internal API which communicates with a multi-process C++ program (and actually a number of sub-programs) that services both the GUI and the REST API. This API has over 100 commands.

    Some of the sub-programs are multi-threaded C++ programs that use parallel programming to tackle a number of the "big data" issues (I love buzzwords). As buzzy as that sentence was, it's true: we can fire up to 256 threads to deal with a specific problem.

    I've developed new techniques for dealing with hundreds of millions of records. For example, the domain name database has over 348,000,000 domain names as of this writing. I have developed a technique that allows searching all 348+M domains for any substring in under 4 seconds. By way of comparison, grep takes over 20 minutes to search for a substring in the .COM zone file (about 125 million domains). I actually have a way to cut the search time by a further 80%, but I'm holding off until search times get into the 5 second range.
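    The actual technique isn't disclosed, but to show the kind of trade involved, here is one common precomputed-index approach (a trigram index, purely illustrative and not the CTB implementation) that avoids a linear grep-style scan:

    ```javascript
    // Build a map from each 3-character substring to the set of
    // name indices containing it. Done once, up front.
    function buildTrigramIndex(names) {
        var index = {};
        names.forEach(function (name, i) {
            for (var j = 0; j + 3 <= name.length; j++) {
                var tri = name.substr(j, 3);
                (index[tri] = index[tri] || new Set()).add(i);
            }
        });
        return index;
    }

    function searchSubstring(names, index, needle) {
        if (needle.length < 3) {
            // Too short for the index; fall back to a linear scan.
            return names.filter(function (n) { return n.indexOf(needle) !== -1; });
        }
        // Candidates share the needle's first trigram; verify each exactly.
        var candidates = index[needle.substr(0, 3)] || new Set();
        var hits = [];
        candidates.forEach(function (i) {
            if (names[i].indexOf(needle) !== -1) hits.push(names[i]);
        });
        return hits;
    }
    ```

    The up-front index cost buys each query a candidate set that is a tiny fraction of the full list, which is the general shape of any sub-linear substring search over a static corpus.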

    Let's say you perform the search described above for all the domains that contain "pixel" somewhere in the domain name. You get back a bunch of results. You can graphically display those results using a force-directed graph that shows the relationships between underlying data of the matching domains (such as common infrastructure). We use D3.js to handle the graphing.

    We have over 200,000,000 Whois records relating to domains. We have both a simple and an advanced search that allow you to mine that data. We use Elastic Search for the backend.

    CTB's landing page does a good job of describing the tools and capabilities. I just wanted to give you an idea here of some of the techniques and tools used under the hood.

  • smmonitor.com

    smmonitor.com is a credentialed site that we created for our investigations; it allows us to do social media monitoring. We have been pioneers in social media monitoring since 2006, when we created our first tools. The majority of the work on this project was completed in 2014. I have a patent pending regarding this product.

    smmonitor.com represents our newest and greatest effort in social media monitoring, and we are considering releasing it as a product for other investigators. Should we do so, the only work that remains is a pretty landing page.

  • ithreat.com

    iThreat is the premier investigative platform, comprised of a number of tools and advanced functionality. Because of NDA restrictions I cannot divulge a lot of details. However, it is responsible for supporting the majority of a business with multi-million-dollar annual revenue.

  • Various Projects

    Here are a number of different projects I've done over the years, carried over from a previous incarnation of this resume.

    Developed a native iOS app for situational awareness. Using the iThreat platform, we could alert users to possible danger and trouble near them. The app also included a lot of information for travelers (things like the CIA World Factbook).

    Developed a poker league web site (pokerleague4.us) that does all the maintenance and user interaction for a bar poker league.

    Developed and oversaw the development of global intelligence, global trade and global piracy monitors. Highly interactive and seriously cool (www.ithreat.com, although the cool stuff is behind credentials). This is the end result of many man-years of development of the intelligence platform.

    Developed a site that allows high-school coaches to report game results using a web interface (saves massive time and greatly improves accuracy). Heavy AJAX programming. (scores.xelent.net).

    Developed a number of webbots/spiders to access different types of information such as RSS feeds, forum sites, etc.

    Developed an intranet for the company with an AJAX-based expense report entry, maintenance, printing and approval system. PDF versions of the expense report are produced. Bookkeeping is able to download the information from within QuickBooks, as all the chart-of-accounts information for expenses is kept in the expense report database. User administration, expense types, chart of accounts and other administrative functions are all built into the application.

    Developed a full-text search and storage network appliance named ftEngine. This is a scalable solution to index and search huge amounts of text very rapidly. The software is written primarily in C++, with some Perl used as a daemon to assist the administration web interface (which is written in PHP). The appliance is administered through a web-based front end.

    Developed an athletics web site for various high schools with full administration capability. It supports a variable number of sports, levels of play and genders, and includes a home-grown message board, schedules, roster maintenance, directions, web page text, statistics tracking, etc. Game results are reported through the system, and anyone can sign up to receive results by email with optional pictures included. There is a photo gallery as well as fund-raising support. There is full statistical capture for basketball teams with real-time web updates: the client, a local laptop for instance, runs a program I wrote in VB.NET (2005 edition) that captures statistical and team information during the game, with an option to post all updates in real time to the school's web site. On the web server you can display the current game's progress in real time. Users accessing the web site can bring up the game page and see players' actions and statistics presented in a table format with color highlighting, all without refreshing the page (sometimes I just love AJAX).

    Developed a web front end that manages a web crawler, which then parses useful information out of the spidered site and indexes it using the aforementioned ftEngine appliance. You can search the spidered site by keyword or by parsed information (such as all the telephone numbers contained on the site).

    Developed a couple of eCommerce sites for companies with custom, home-grown shopping carts and online payment. I also implemented online ordering for a restaurant.

Skills (as of 1/2018)


PHP - Expert, 17 years
Javascript - Expert, 17 years
C/C++ - Advanced, 20+ years
MySQL - Expert, 19+ years
Perl - Medium, 13 years
VB.NET - Advanced, 16 years
Various Assembly Languages - Expert, 20+ years
Swift - Advanced, 5 years

Advanced Knowledge

  • NodeJS
  • AJAX
  • D3js
  • Elastic Search
  • HTML 5
  • CSS
  • Bootstrap
  • jQuery
  • jQueryMobile
  • Apache
  • Linux
  • MacOS
  • iOS
  • Python
  • Cordova (PhoneGap)
  • Numerous APIs
  • Big Data
  • GitHub
  • Sockets (including web sockets)
  • OOP
  • HAPI

Code, Etc.

Here you will find some examples of things I've done or code samples. It is not meant to be an in-depth display of my coding skills. Just little samples to prove that I know the difference between "=", "==", and "===".

I'll add to this section as I get time and can carve out more examples.

JavaScript Sample

This example covers the handling of form submissions. I used it on CTB (under jQueryMobile) because I wanted a centralized form handler.

The callBack handler is called after the input parameters have been verified by the second switch statement.

$('form').submit(function(event) {
    var thisForm=$(this);
    var formUrl=thisForm.attr('action');
    var formName=thisForm.attr('name');
    var dataToSend=thisForm.serialize();

    // Runs after the AJAX call completes; dispatches on the form name.
    var callBack=function(dataReceived) {
        switch(formName) {
            case "awForm":
                // ... display the awForm results ...
                break;
            case "dvForm":
                // ... display the dvForm results ...
                break;
        }
    };

    // Verify the input parameters before anything is sent.
    var rtnType="html";
    switch(formName) {
        case "awForm":
            var tmp=$.trim($('#awSearchTerms').val());
            if (tmp.length==0) {
                putError("sw","You must specify a search term."); // site's error helper
                return false;
            }
            break;
        case "dvForm":
            // ... validation for the dvForm fields ...
            break;
    }

    $.post(formUrl,dataToSend,callBack,rtnType);
    return false;
});

Fun With Javascript and CSS

I had to refactor a menu screen from about 9 years ago and decided to have some fun with buttons. I came up with a fairly simple solution using CSS and minimal Javascript.

The first block is the CSS used.

.menuButtonNS {
    border-radius: 6px 6px 6px 6px;
}
.menuButton {
    border-radius: 6px 6px 6px 6px;
    box-shadow: 10px 10px 5px #888888;
}

The Javascript is even simpler.

function depressButton(e) {
    // Switch the pressed button to the flat (no-shadow) style.
    $(e).removeClass('menuButton').addClass('menuButtonNS');
}
function releaseButton() {
    // Restore the raised style on any pressed button.
    $('.menuButtonNS').removeClass('menuButtonNS').addClass('menuButton');
}

And finally the HTML.

<section id="adminMainPage" onmouseup="releaseButton();">
<div class="container" style="margin-top:50px">

  <div class="row">
    <div class="col-sm-1">
      <a onclick="alert('Clicked Reports Button');" title="Click to run reports" class="active" onmousedown="depressButton('#but1');">
        <div id="but1" class="menuButton"><i class="fa fa-bar-chart"></i></div></a>
    </div>
    <div class="col-sm-4">
      <h1 class="hdr"><a onclick="alert('Clicked Reports Button');" title="Click to run reports" class="active">Reports</a></h1>
    </div>
    <div class="col-sm-1">
      <a onclick="alert('Clicked Tools Button');" title="Click to use the tools" class="active" onmousedown="depressButton('#but2');">
        <div id="but2" class="menuButton"><i class="fa fa-wrench"></i></div></a>
    </div>
    <div class="col-sm-4">
      <h1 class="hdr"><a onclick="alert('Clicked Tools Button');" title="Click to use the tools" class="active">Tools</a></h1>
    </div>
    <div class="col-sm-3"></div>
  </div>

  <div class="row">
    <div class="col-sm-1">
      <a onclick="alert('Clicked Users Button');" title="Click to maintain user records" class="active" onmousedown="depressButton('#but3');">
        <div id="but3" class="menuButton"><i class="fa fa-user"></i></div></a>
    </div>
    <div class="col-sm-4">
      <h1 class="hdr"><a onclick="alert('Clicked Users Button');" title="Click to maintain user records" class="active">Users</a></h1>
    </div>
    <div class="col-sm-1">
      <a onclick="alert('Clicked Archives Button');" title="Click to use the archives" class="active" onmousedown="depressButton('#but4');">
        <div id="but4" class="menuButton"><i class="fa fa-archive"></i></div></a>
    </div>
    <div class="col-sm-4">
      <h1 class="hdr"><a onclick="alert('Clicked Archives Button');" title="Click to use the archives" class="active">Archives</a></h1>
    </div>
    <div class="col-sm-3"></div>
  </div>

</div>
</section>



Captcha Example

I grew to dislike the majority of captchas out there, and decided a number of years ago to come up with one that was reasonably easy for a human to read, not defeated by OCR, and that really required a person on the other end.



Contact Me

I am available at the email below. Please do not contact me at ICG for non-ICG issues.

  • Stroudsburg, PA | Located in the Pocono Mountains of Northeast PA
  • mlewis@xelent.net