This site was developed to showcase some of my abilities for independent work. If you're looking for a copy-and-paste monkey who lives on Stack Overflow — I'm not your guy. However ...
I am a Whois and DNS subject matter expert. I have spent the last 21 years writing tools to aid in OSINT investigations and to analyze the resulting data. I was the primary architect of the DAAR system, developed for ICANN for domain abuse analysis. Most of these tools are gathered in the cybertoolbelt.com platform.
If you're looking for someone who has pretty much done it all, produces quality solutions insanely fast — then keep reading.
I started my career designing and writing operating systems, programming languages and database systems. As new technologies appeared on the scene I learned and applied the best of them to my work. I am fluent in many programming languages. I am a workaholic (typically putting in 90-105 hours per week up until the pandemic). I have since slowed to 40-55 hours per week.
I built this web site using Bootstrap as the base with PHP & JavaScript for the dynamic aspects.
Basically I love what I do and according to the old adage I haven't worked a day in my life.
I can design simple or complex systems. Everything from full SaaS platforms to eCommerce.
Every web site I have done for the last 9 years has used liquid design concepts (what we today call responsive design). I made the decision after having an iPhone for six months and seeing that mobile was the wave I wanted to be on top of.
I work well in a team environment. Over the last 12 years I have usually been leading the team but I can mesh my skills with others in a very collaborative way. I have almost always learned something from everyone I have worked with and hope they've learned something from me.
I can code in many different languages and am proficient in many different techniques such as multi-threaded parallel programming.
In other words, I can take a concept and produce a working solution for all aspects of the project. I have knowledge and proficiency in all aspects of project implementation and management and use that knowledge to quickly create solutions.
I love to learn. I love new things. The Raspberry Pi came out and I bought one immediately and spent the weekend figuring out how to get it to unlock my front door when I showed up with my iPhone in my pocket. I've created a number of useful little projects using the Pi, such as a passive DNS sensor.
Currently I am the CEO of CyberTOOLBELT®. CTB is a SaaS investigative platform, and I am a working CEO. CTB was spun off from iThreat Cyber Group (ICG), where I was the CTO with a team of developers. ICG was an internet investigation firm at its heart: basically time and materials work. When I started there I saw many ways to automate the methods that were used in investigations. This led to the development of CTB.
The results were dramatic. An investigation that could take 8 hours using free and paid internet tools could be reduced to 45 minutes to an hour. Our margins went way up. Along the way I developed a deep understanding of Whois, DNS, and abuse of the DNS, among other skills. This allowed me to design and implement tools and systems to further force-multiply the efforts of our in-house analysts and, later, those of the larger investigative community.
In January of 2022 iThreat was sold off in pieces: the investigations unit, CTB and CleanDNS. I purchased CTB and, with a few of my team, started CTB as an independent company.
I had been with ICG since 2003. While there I undertook many complex projects as well as oversaw the development team. Before ICG I worked full time as a hired gun for over 10 years. Since our most expensive assets are our analysts, my mantra was to figure out how to make them more effective: how can I swap machine cycles for human effort, improving our bottom line and helping us arrive at answers faster and in more depth than the next guy?
I have a ton of experience doing just about everything. I am listing only the highlights of projects which cover pretty much the entire range of what I do. I go into some detail when I can; however, in many cases I am limited by NDAs in what I can say.
I have developed data-access techniques based on the data type. For example, I developed a unique data-access system for our raw Whois information that allows sub-millisecond access to any record in our large database of raw Whois data. In the process, I shrank a multi-terabyte raw Whois MySQL database to 1/10th its size.
The TL;DR is that I love designing things and creating efficiencies.
I apologize in advance for any typos or grammatical errors. I composed this site using Nano at the Linux command line. Its spell checker is the worst.
Designed and developed a passwordless authentication package and web site, 2fa.solutions. I designed and implemented the web site and backend API, as well as collaborated on the iOS/Android app.
I purchased the CyberTOOLBELT assets and hired some of the developers who were working under me at iThreat when I was CTO. Our goal has been to continue expanding the platform's ability to investigate internet-related issues. Clients such as the NCA, Interpol, Disney, HMRC and ICANN have all used our platform for years.
We are currently developing a log-in authenticator mobile app (www.2fa.solutions) that allows companies to easily add passwordless sign-ins to their platforms using biometrics. The system is planned to go into beta testing in February. The following is a look at the CTB implementation:
We have also developed a CTB-specific mobile app to both handle passwordless logins to CTB and provide search capabilities and the ability to submit and receive reports.
The following screenshots are from an early version of the App. We are currently concentrating on utility over looks. Pretty will come later.
We have now developed and integrated over 60 tools into CTB and incorporate many diverse datasets. We provide monitoring of various aspects of the DNS, web pages and abuse.
We have simplified our dashboard so that you can enter a data item and the system is very good at figuring out what it is and running a number of automated searches for you.
The 4.x series of CyberTOOLBELT was a major revamp of the entire system, from a more intuitive interface to a roughly 75% rewrite of the underlying structure. I created a number of microservices in Go that isolated and sped up data access tremendously. This also had the side effect of isolating various logical components of the system from each other, making the system more resilient. Entire sub-systems such as Whois processing, abuse processing, and DNS were handled by the various microservices.
The internal reconfiguration also allowed us to query disparate data sources (structured, unstructured, SQL, NoSQL, etc.) as basically one entity. This very much resembles what's become known as the data lake concept.
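A generic Go sketch of the pattern (not the production code, which covers many more source types): every backend hides behind one interface, and a federator fans the query out and merges the answers, so the caller never knows or cares which store the data lives in.

```go
package main

import (
	"fmt"
	"strings"
	"sync"
)

// Source is the one abstraction every backend (SQL, NoSQL, flat files,
// microservice) implements; callers never see the underlying store.
type Source interface {
	Name() string
	Query(term string) []string
}

// memSource stands in for any concrete backend in this sketch.
type memSource struct {
	name string
	rows []string
}

func (m memSource) Name() string { return m.name }
func (m memSource) Query(term string) []string {
	var out []string
	for _, r := range m.rows {
		if strings.Contains(r, term) {
			out = append(out, r)
		}
	}
	return out
}

// Federate fans the query out to every source concurrently and merges
// the results, so disparate stores look like one entity to the caller.
func Federate(sources []Source, term string) map[string][]string {
	results := make(map[string][]string)
	var mu sync.Mutex
	var wg sync.WaitGroup
	for _, s := range sources {
		wg.Add(1)
		go func(s Source) {
			defer wg.Done()
			rows := s.Query(term)
			mu.Lock()
			results[s.Name()] = rows
			mu.Unlock()
		}(s)
	}
	wg.Wait()
	return results
}

func main() {
	sources := []Source{
		memSource{"whois", []string{"example.com registrant: ACME"}},
		memSource{"dns", []string{"example.com A 93.184.216.34"}},
	}
	for name, rows := range Federate(sources, "example.com") {
		fmt.Println(name, rows)
	}
}
```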
While at ICG I designed and was the architect of the ICANN DAAR project which monitors and reports on TLD and registrar abuse.
The 3.0 release of CyberTOOLBELT brings with it an intelligent assistant named Max. Max is designed to help the analyst's workflow. Max uses machine-learning-based techniques to communicate and adapt to the analyst's needs. For example, the analyst can ask Max to look up a domain such as cybertoolbelt.com and then ask "what IPs are related". Max retains context and adjusts to the user's method of communication. Many additional capabilities are planned for Max.
We have also refactored the GUI frontend to clean up, remove, enhance and spruce up everything we could. The goal was to make it load faster and provide new functionality.
Some new tools were added, such as the ability to retrieve a Twitter network mapping of N Twitter handles. Many existing tools got new features and export capabilities.
CyberTOOLBELT deals with many terabytes of data and imports/generates gigabytes per day. This data is shared between the API and many users around the world.
Redesigned and reimagined cybertoolbelt.com and the client-facing RESTful API. The redesign of the GUI was for two primary reasons: Make it nicer looking and improve workflow. Out of this came additional capabilities and improved performance. The API was rewritten for performance and expansion capabilities.
The rewrite of the front end allowed me to develop a method of truly multitasking JavaScript without web workers or other hacks.
CyberTOOLBELT (CTB) is a Single Page Application (SPA). Bootstrap is used for the responsive layout/presentation control. I designed and implemented an InterProcess Communication (IPC) system for the remote servers, local servers and processes to communicate with each other and the user.
The IPC is at the heart of the multitasking capabilities. For example, a domain lookup from the platform actually results in 8 calls to the backend. The JavaScript front end makes all 8 calls without waiting on any individual one. The calls are processed concurrently on the backend and the front end display is updated as each call finishes. Calls can take anywhere from tens of milliseconds to over a minute to complete depending on the quantity of data to be returned. Yet the user on the front end can switch to another tool and proceed without waiting for prior tasks to finish.
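The backend side of that fan-out can be sketched in Go (tool names and the simulated latency below are illustrative, not the real IPC code): every call runs in its own goroutine, and results stream back over a channel in completion order, so a slow call never blocks a fast one.

```go
package main

import (
	"fmt"
	"sync"
	"time"
)

// result is what each backend call sends back as soon as it finishes.
type result struct {
	tool string
	data string
}

// lookupDomain fires every backend call at once and streams results to
// the caller over a channel in completion order — the same fan-out idea
// behind the 8 concurrent calls a domain lookup makes.
func lookupDomain(domain string, tools []string) <-chan result {
	out := make(chan result, len(tools))
	var wg sync.WaitGroup
	for _, tool := range tools {
		wg.Add(1)
		go func(tool string) {
			defer wg.Done()
			// stand-in for the real backend call, which may take
			// milliseconds or over a minute depending on the data
			time.Sleep(10 * time.Millisecond)
			out <- result{tool: tool, data: tool + " data for " + domain}
		}(tool)
	}
	go func() {
		wg.Wait()
		close(out) // all calls done; the consumer's loop ends
	}()
	return out
}

func main() {
	tools := []string{"whois", "dns", "passive-dns", "screenshot",
		"ssl", "abuse", "history", "related"}
	for r := range lookupDomain("example.com", tools) {
		// the UI panel for each tool updates as its call completes
		fmt.Println("updated panel:", r.tool)
	}
}
```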
I designed and implemented much of this intelligence platform. It leverages the CyberTOOLBELT backend to produce comprehensive reports about domains, IP addresses, email addresses and Whois datamining. We call it our "Analyst-in-a-box" product. The thinking is that we would distill how our internet analysts investigate these items and attempt to make it possible for a person without that expertise to have an investigation done without spending hundreds of dollars in a time and materials investigation.
The system produces reports primarily as PDF files. The other report format is Excel-compatible spreadsheets (not .csv files). The reports range in price from $4.95 — $29.95. This pricing is designed to compete directly with Time and Material shops (such as our own). The minimum internet investigation of a domain by a T&M shop varies from $100 — $350 (for an hour's worth of work).
The reports generated are not simply a dump of data from our extensive database. When possible the software attempts to point to interesting things in the data.
*** I have taken this site down for a redesign. ***
Ignoring the stupid name (stupid because it has nothing to do with what the site does; it was a domain I had lying around — we'll get a better one later), this is kind of a cool little tool to pretty-format MySQL .FRM files. It's something I dabble in every so often as I need new features or improve existing ones. It's one of those neat little tools that you write for yourself that may be useful for other developers too. Please send me feedback if you run into any problems or have suggestions.
The iThreat Fusion Center (iFC) came about because one of our larger clients (a Fortune 100 company) needed to manage the insane amount of data arriving at their company from an incredible number of sources. This data arrives by email, RSS feed, API, Microsoft Word docs, PDFs, human analysts, and intelligence/alerting services such as NC4, Stratfor, and others.
This project involves converting PDF and .DOCX files for indexing, as well as parsing emails to separate out the information needed for searching and for intelligence summaries.
Beyond a heatmap of activity, we have another map which displays their travelers (they average 12,000+ people a day on the road) as well as their 5,000+ facilities. So if an incident occurs they can quickly determine if they have people in the area (including on planes and trains) and contact them using the system.
One of the more interesting things involved in the project is that two ways to search the library had to be implemented. One I call the "old guy" way. This is for users who are more comfortable searching the library as they would a regular library. First find the source. Then peruse the titles until you find what you are looking for.
The other way to search is using a search engine (in this case, Elasticsearch). I didn't want separate standard and advanced searches, so I allow free-form text input to the search front end. It then gets compiled down by my parser into something that makes sense to Elasticsearch. Consider the somewhat absurd query:
"this is a phrase" and date on 2/3/2014 date on or before 2015-03-29 and (date>=3/18/2016 body contains lewis not title contains michael) cybertoolbelt.com country only 'united states' and source is pinkerton
The first step in the compilation process is validation, which produces a tokenized string for the final step. For the query above, the tokenized string is:
"this is a phrase" AND date = 2014-02-03 OR date <= 2015-03-29 AND (date >= 2016-03-18 OR body contains lewis NOT title contains michael) OR cybertoolbelt.com OR country only united states AND source = pinkerton
Note that the input could be as simple as michael. The parser supports AND, OR, NOT and NEAR as operators for terms, as well as phrase searching and "only" searching, which may be unfamiliar by that name (I made up the operator). Basically, documents in the search database can be tagged with multiple countries or multiple other keywords. "Only" specifies that the given tag (in this case the country name) must be the only tag associated with the document for the document to be selected.
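The "only" rule is simple to state in code. A minimal Go sketch, with hypothetical names (this is the semantics, not the production filter):

```go
package main

import "fmt"

// doc is a hypothetical search record with its country tags.
type doc struct {
	title     string
	countries []string
}

// onlyMatch implements the "only" operator: the wanted tag must be the
// document's sole tag of that kind for the document to be selected.
func onlyMatch(tags []string, want string) bool {
	return len(tags) == 1 && tags[0] == want
}

func main() {
	docs := []doc{
		{"domestic report", []string{"united states"}},
		{"joint report", []string{"united states", "canada"}},
	}
	// only the first document survives `country only 'united states'`
	for _, d := range docs {
		if onlyMatch(d.countries, "united states") {
			fmt.Println("selected:", d.title)
		}
	}
}
```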
There is also extensive date operation handling. For example you could enter michael since monday or yemen since last month or graduation on or before 4/1/2013 and graduation<=3 weeks ago. Just about any way you can specify a date is handled.
I came up with the idea of creating an investigative platform for internet investigators. I noticed that the analysts at ICG were using a number of different tools and copying and pasting the results into a spreadsheet. There had to be a better way, and I think CyberTOOLBELT is that way.
CyberTOOLBELT (CTB) is comprised of a number of subsystems and deals with big data. The system has many billions of records related to domains, IP addresses and other items.
The design philosophy I took was to make the system easy to use for users who are not necessarily strong in their computer skills and yet not get in the way of skilled users.
The tool design philosophy is basically, make the tool the best version of that tool it can be. An example would be the Traceroute tool. You can run traceroute from the command line and there are about a zillion traceroute tools on the web. What I did was design and implement a more useful tool. The CTB traceroute tool allows you to examine an IP address or a domain from up to eight different starting points around the world at the same time. Anyone who has used traceroute can immediately see the advantages of this approach.
Users see the web site GUI and interact with the data using that interface. There is also a REST API built that allows bulk users to access certain exposed aspects of our data from their own servers.
Underneath it all is an internal API which communicates with a multi-process C++ program (and actually a number of sub-programs) that services both the GUI and the REST API. This API has over 100 commands.
Some of the sub-programs are multi-threaded C++ programs that use parallel programming to tackle a number of the "big data" issues (I love buzzwords). As buzzy as that sentence was, it's true. We can fire up to 256 threads to deal with a specific problem.
I've developed new techniques for dealing with hundreds of millions of records. For example, the domain name database has over 348,000,000 domain names as of this writing. I have developed a technique that allows the searching of all 348+M domains for any substring in under 4 seconds. By way of comparison, the GREP program takes over 20 minutes to search for a substring in the .COM zone file (about 125 million domains). I actually have a way to cut the search time by a further 80%, but I'm holding off until search times get into the 5 second range.
Let's say you perform the search described above for all the domains that contain "pixel" somewhere in the domain name. You get back a bunch of results. You can graphically display those results using a force-directed graph that shows the relationships between underlying data of the matching domains (such as common infrastructure). We use D3.js to handle the graphing.
We have over 200,000,000 Whois records relating to domains. We have both a simple and an advanced search that allow you to mine that data. We use Elasticsearch for the backend.
CTB's landing page does a good job of describing the tools and capabilities. I just wanted to give you an idea here of some of the techniques and tools used under the hood.
smmonitor.com is a credentialed site that we created for our investigations that allows us to do social media monitoring. We have been pioneers in social media monitoring since 2006, when we created our first tools. The majority of the work on this project was completed in 2014. I have a patent pending related to this product.
smmonitor.com represents our newest and greatest effort in social media monitoring, and we are considering releasing it as a product for other investigators. Should we do so, the only work that needs to be done is a pretty landing page.
iThreat is the premier investigative platform. It is comprised of a number of tools and advanced functionality. Because of NDA restrictions I cannot divulge a lot of details. However, it is responsible for supporting the majority of a multi-million annual revenue business.
There are a number of different projects I've done over the years from a previous resume incarnation that I've included here.
Developed a native iOS app for situational awareness. Using the iThreat platform we could alert users to possible danger and trouble near them. The app also included a lot of information for travelers (things like the CIA World Factbook).
Developed a poker league web site (pokerleague4.us) that does all the maintenance and user interaction for a bar poker league.
Developed and oversaw the development of global intelligence, global trade and global piracy monitors. Highly interactive and seriously cool (www.ithreat.com — although the cool stuff is behind credentials). This is the end result of many man-years of development of the intelligence platform.
Developed a site that allows high-school coaches to report game results using a web interface (saves massive time and greatly improves accuracy). Heavy AJAX programming. (scores.xelent.net).
Developed a number of webbots/spiders to access different types of information such as RSS feeds, forum sites, etc.
Developed an intranet for the company with an AJAX-based expense report entry, maintenance, printing and approval system. PDF versions of the expense report are produced. Bookkeeping is able to download the information from within QuickBooks, as all the chart of accounts information for expenses is kept in the expense report database. User administration, expense type, chart of accounts and other administrative functions are all built into the application.
Developed a full text search and storage network appliance named ftEngine. This is a scalable solution to index and search huge amounts of text very rapidly. The software is written primarily in C++, with some Perl used in a daemon that assists the administration web interface (which is written in PHP). The appliance is administered through a web-based front end.
Developed an athletics web site for various high schools with full administration capability. It features a variable number of sports, levels of play and genders, and includes a home-grown message board, schedules, roster maintenance, directions, web page text, statistics tracking, etc. Game results are reported through the system and anyone can sign up to receive results by email, with optional pictures included. There is a photo gallery as well as fund-raising support. There is full statistical capture for basketball teams with real-time web updates. The client, a local laptop for instance, runs a program I wrote in VB.NET (2005 edition) that captures statistical and team information during the game. There is an option to post all updates in real time to the school's web site, where you can display the current game's progress. Users accessing the web site can bring up the game page and see player actions and statistics presented in a table format with color highlighting, all without refreshing the page (sometimes I just love AJAX). Show Me...
Developed a web front end that manages a web crawler that then indexes the spidered site parsing out useful information and indexing it using the aforementioned ftEngine appliance. You can search the spidered site by keyword or by parsed information (such as all the telephone numbers contained on the site).
Developed a couple of eCommerce sites for companies with custom, home-grown shopping carts and online payment. I also implemented online ordering for a restaurant.
Here you will find some examples of things I've done or code samples. It is not meant to be an in-depth display of my coding skills. Just little samples to prove that I know the difference between "=", "==", and "===".
Obviously I can't post anything that was developed for someone else. An example is a front-end processor for an Elasticsearch cluster that converted a somewhat human-like query against the Whois database, so that input such as:
name contains lewis and state is nj or state=pa and registered on or after 8/4/2016 and organization not "icg,inc"
would return relevant results to the user without them needing any Elasticsearch knowledge. The parser and interface to Elasticsearch were written primarily in C++ with some PHP.
Note that the following are rather simple examples. Especially since I used to design and implement operating systems, languages and database software.
This is a simple goroutine I use to garbage-collect expired requests in the API for 2FA Solutions. It runs in the background while the main routines process API requests.
go func() {
	for {
		time.Sleep(900 * time.Second)
		db, err := DB("master")
		if err != nil {
			fmt.Println("could not make DATABASE connection in delete old code requests loop", err)
			continue
		}
		s := `delete from code_request where msg_type='validate' && expires+600<=UNIX_TIMESTAMP()`
		if _, err = db.Exec(s); err != nil {
			fmt.Println("attempting to delete old validate code_requests", err)
		}
		s = `delete from code_request where msg_type='validateemail' && expires+1200<=UNIX_TIMESTAMP()`
		if _, err = db.Exec(s); err != nil {
			fmt.Println("attempting to delete old validateemail code_requests", err)
		}
		// close explicitly each iteration; a defer inside this
		// infinite loop would never run and would pile up
		db.Close()
	}
}()
This example covers the handling of form submissions. I used it on CTB (under jQuery Mobile). I wanted a centralized form handler.
The callBack handler is called after the input parameters have been verified by the second switch statement.
$('form').submit(function(event) {
	var thisForm = $(this);
	var formUrl = thisForm.attr('action');
	var formName = thisForm.attr('name');
	event.preventDefault();
	var dataToSend = thisForm.serialize();
	var callBack = function(dataReceived) {
		dataReceived = trim(dataReceived);
		switch (formName) {
			case "awForm":
				$('#advWhois').dialog("close");
				$('#whoisSearchResults').html(dataReceived).trigger('create');
				break;
			case "dvForm":
				...
		}
	};
	var rtnType = "html";
	switch (formName) {
		case "awForm":
			var tmp = trim($('#awSearchTerms').val());
			if (tmp.length == 0) {
				putError("sw", "You must specify a search term.");
				return false;
			}
			rtnType = "text";
			break;
		case "dvForm":
			...
	}
	$.post(formUrl, dataToSend, callBack, rtnType);
	return false;
});
I had to refactor a menu screen from about 9 years ago and decided to have some fun with buttons. I came up with a fairly simple solution using CSS and minimal JavaScript.
The first block is the CSS used.
<style>
.hdr { margin-left:20px }
.menuButtonNS {
	margin-top:12px;
	margin-left:15px;
	left:10px;
	top:10px;
	width:50px;
	height:54px;
	background-color:#808080;
	color:white;
	text-align:center;
	padding:8px;
	font-size:30px;
	border-radius:6px 6px 6px 6px;
	margin-bottom:40px;
}
.menuButton {
	margin-top:12px;
	margin-left:15px;
	width:50px;
	height:48px;
	background-color:#404040;
	color:white;
	text-align:center;
	padding:4px;
	font-size:30px;
	border-radius:6px 6px 6px 6px;
	margin-bottom:20px;
	box-shadow: 10px 10px 5px #888888;
}
</style>
The JavaScript is even simpler.
<script>
function depressButton(e) {
	$(e).removeClass("menuButton");
	$(e).addClass("menuButtonNS");
}
function releaseButton() {
	$('.menuButtonNS').addClass("menuButton");
	$('.menuButtonNS').removeClass("menuButtonNS");
}
</script>
And finally the HTML.
<section id="adminMainPage" onmouseup="releaseButton();">
	<div class="container" style="margin-top:50px">
		<div class="row">
			<div class="col-sm-1">
				<a onclick="alert('Clicked Reports Button');" title="Click to run reports" class="active" onmousedown="depressButton('#but1');">
					<div id="but1" class="menuButton"><i class="fa fa-bar-chart"></i></div>
				</a>
			</div>
			<div class="col-sm-4">
				<h1 class="hdr"><a onclick="alert('Clicked Reports Button');" title="Click to run reports" class="active">Reports</a></h1>
			</div>
			<div class="col-sm-1">
				<a onclick="alert('Clicked Tools Button');" title="Click to use the tools" class="active" onmousedown="depressButton('#but2');">
					<div id="but2" class="menuButton"><i class="fa fa-wrench"></i></div>
				</a>
			</div>
			<div class="col-sm-4">
				<h1 class="hdr"><a onclick="alert('Clicked Tools Button');" title="Click to use the tools" class="active">Tools</a></h1>
			</div>
			<div class="col-sm-3"></div>
		</div>
		<div class="row">
			<div class="col-sm-1">
				<a onclick="alert('Clicked Users Button');" title="Click to maintain user records" class="active" onmousedown="depressButton('#but3');">
					<div id="but3" class="menuButton"><i class="fa fa-user"></i></div>
				</a>
			</div>
			<div class="col-sm-4">
				<h1 class="hdr"><a onclick="alert('Clicked Users Button');" title="Click to maintain user records" class="active">Users</a></h1>
			</div>
			<div class="col-sm-1">
				<a onclick="alert('Clicked Archives Button');" title="Click to use the archives" class="active" onmousedown="depressButton('#but4');">
					<div id="but4" class="menuButton"><i class="fa fa-archive"></i></div>
				</a>
			</div>
			<div class="col-sm-4">
				<h1 class="hdr"><a onclick="alert('Clicked Archives Button');" title="Click to use the archives" class="active">Archives</a></h1>
			</div>
			<div class="col-sm-3"></div>
		</div>
	</div>
</section>
I grew to dislike the majority of captchas out there, and a number of years ago I decided to come up with one that was reasonably easy to read, not defeated by OCR, and really required a person on the other end.
Contact Me
I am available at the email below.