Wednesday, May 27, 2020

Debugging TypeScript Phaser Apps Server and Client-side Using VS Code

I’m using the Phaser game development platform to teach my kids some video game design and programming concepts (I previously used LEGO to draw my daughter into building game sprites). As a new Phaser user, I'm finding there isn't enough writing out there about using the Phaser 3 HTML5 game development platform with TypeScript, so I feel I should add my two cents. My first contribution is about debugging with breakpoints in Phaser and Visual Studio Code (VSCode).

Note: You can read more about setting up VSCode for TypeScript debugging on the Microsoft site, but one tip is to remember you'll need to make sure you have "sourceMap": true in your tsconfig.json file.
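For reference, here's a minimal tsconfig.json fragment with that flag set (the other options are just typical examples, not requirements):

```json
{
    "compilerOptions": {
        "sourceMap": true,
        "outDir": "./dist",
        "target": "es5"
    }
}
```

Without "sourceMap": true, the debugger has no way to map the transpiled JavaScript back to your .ts files, which is exactly when you see unbound breakpoints.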

If you're using Phaser for a multi-player game, you're probably using Node.js and Express to fire up a server for the Phaser client-side code. I found this complicated debugging with breakpoints in VS Code: I haven't been able to get a single configuration to hit breakpoints in both the server and client-side code. This makes sense, but I still thought it would be handled more easily.

If I launch a debugger for the server, any breakpoints I've set in my client-side code show "Breakpoint set but not yet bound" or the more direct error, "Breakpoint ignored because generated code not found (source map problem?)" My workaround is to launch the debugger depending on which breakpoints I need.

Server-side debugging

To debug the server, use VS Code's "attach to process" option and choose the dist/server.js process (server.js is the transpiled version of my server.ts file that starts the Node.js server).


Client-side debugging

To debug the client (this uses the Visual Studio Code Chrome debugger extension), I use VSCode to launch the debugger with the Launch client-side task.

Here is the entire launch.json file that supports both client and server-side debugging with breakpoints: 

    {
        // Use IntelliSense to learn about possible attributes.
        // Hover to view descriptions of existing attributes.
        "version": "0.2.0",
        "configurations": [
            {
                "name": "Launch client-side: localhost:8080",
                "type": "chrome",
                "request": "launch",
                "url": "http://localhost:8080/index.html",
                "webRoot": "${workspaceFolder}/dist/client"
            },
            {
                "name": "Attach to Process dist/server.js",
                "type": "node",
                "request": "attach",
                "processId": "${command:PickProcess}",
                "port": 8080
            }
        ]
    }

Tuesday, April 21, 2020

Running NPM in Windows Terminal Ubuntu profile

I got some nasty errors when I tried to use NPM and Node.js in the new Windows Terminal for Windows 10 (which BTW has a cool split-pane feature that might just make it my new go-to terminal).

The errors looked like this:

 not foundram Files/nodejs/npm: 3: /mnt/c/Program Files/nodejs/npm:
: not foundram Files/nodejs/npm: 5: /mnt/c/Program Files/nodejs/npm:
/mnt/c/Program Files/nodejs/npm: 6: /mnt/c/Program Files/nodejs/npm: Syntax error: word unexpected (expecting "in")
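These errors happen because the WSL shell is resolving the Windows copy of npm (under /mnt/c), a Windows wrapper script that Linux can't execute. A quick sketch of the distinction (the paths are examples):

```javascript
// Classify which npm a WSL shell has resolved: anything under /mnt/c is the
// Windows install, whose wrapper script produces errors like the ones above.
function classifyNpm(resolvedPath) {
  return resolvedPath.startsWith("/mnt/c/") ? "windows" : "linux";
}

console.log(classifyNpm("/mnt/c/Program Files/nodejs/npm")); // windows
console.log(classifyNpm("/usr/bin/npm"));                    // linux
```

Installing Node.js and NPM inside the Linux environment (below) puts a native copy ahead of the Windows one on the PATH.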

Here's the fix

$ sudo apt-get update
$ sudo apt-get install -y nodejs
$ sudo apt-get install build-essential
$ sudo apt-get install npm

I found this fix online.

BTW -- example split-pane keys: Alt+Shift+-, Alt+Shift+= will split horizontally and vertically.

Tuesday, December 10, 2019

My New Gig

I'm super excited to have taken a new position at one of the fastest-growing software start-ups in Vancouver. The company is developing the Ops Platform for DevOps and was recently featured on TechCrunch after closing a funding round with Slack and Tiger Global.

I'm also happy to be back in Gastown. This is at least the fourth software company I've worked at that had a Gastown office, and this one is literally across the street from the first one--NCompass Labs. Being back in the neighbourhood and working with such a promising team feels new and nostalgic at the same time.

From the TechCrunch article: "[The company] lets developers build and borrow DevOps shortcuts. These automate long series of steps they usually have to do manually, thanks to integrations with GitHub, AWS, Slack and more. [It] claims it can turn a days-long process like setting up a Kubernetes cluster into a 15-minute task even salespeople can handle. The startup offers both a platform for engineering and sharing shortcuts, and a service where it can custom build shortcuts for big customers."

“Building tools that streamline software development is really expensive for companies, especially when they need their developers focused on building features and shipping to customers,” [CEO, Kyle] Campbell tells me. The same way startups don’t build their own cloud infrastructure and just use AWS, or don’t build their own telecom APIs and just use Twilio, he wants to be the “easy button” for developer tools.

Sunday, June 16, 2019

Minecraft Bonding With My Daughter

I hadn't played much Minecraft since a co-worker showed me an original beta version, but I knew enough about the best-selling PC game to suggest that my creative daughter give it a try. I thought it would be a great outlet for her crafty passion and also a nice way to get some more exposure to computers. At first, we tried Creative mode and, having just watched The Swiss Family Robinson, we jumped into building a treehouse. We spent a reasonable amount of time working on that world and then one (or both?) of us suggested we try Survival mode.

Survival mode is a real-time strategy game and is therefore highly addictive. Before we knew it, we were using any time we could to build up our world and it got quite intense. The first time my daughter fell into lava, she was distraught because she had lost everything she was carrying: tools, weapons, supplies, etc. I started to regret ever leaving Creative mode, but things calmed down and we're now at a reasonable place where it's just one of the things we do and it's not such a big deal if someone goes swimming in lava.

Here's a screenshot of the first survival house we built that wasn't just a shelter from monsters.


The best part is that she got interested in Redstone engineering. For anyone who doesn't know, Redstone is the Minecraft way to create circuits that can power machines, traps, and other contraptions. My wife suggested a lighthouse would be cool, so we built a functioning lighthouse together using Redstone. My daughter even has her own Creative mode test world where she tries out Redstone inventions before she uses them in her other worlds. It's super cool.

I highly recommend this sort of cooperative play with your kids--just watch those screen time limits. We've thoroughly enjoyed the experience and my daughter has even started doing a Minecraft after school program.

Sunday, April 08, 2018

Deploying Python app to Google Cloud Platform (GCP) fails for numpy==1.9.3

I'm not sure what caused this issue. I've had this GCP Python app running for many months, but today the deploy command failed with this error.

$ gcloud app deploy ~/appFolder/app.yaml ~/appFolder/cron.yaml --verbosity=error

Step #1:   Could not find a version that satisfies the requirement numpy==1.9.3 (from versions: 1.14.5, 1.14.6, 1.15.0rc2, 1.15.0, 1.15.1, 1.15.2, 1.15.3, 1.15.4, 1.16.0rc1, 1.16.0rc2, 1.16.0, 1.16.1)
Step #1: No matching distribution found for numpy==1.9.3
Step #1: You are using pip version 10.0.1, however version 19.0.1 is available.
Step #1: You should consider upgrading via the 'pip install --upgrade pip' command.
Step #1: The command '/bin/sh -c pip install -r requirements.txt' returned a non-zero code: 1
Finished Step #1
ERROR: build step 1 "" failed: exit status 1
Step #1: 

ERROR: ( Cloud build failed. 

To resolve the issue, I changed the pandas requirement in requirements.txt to:

pandas; python_version >= '3.5'
pandas<0.21; python_version == '3.4'

Then I ran:

$ pip install  -r requirements.txt

I found this solution in a comment on a GitHub issue.

Saturday, March 31, 2018

Using PS3 wireless controllers with SUMOSYS

I got some inexpensive unofficial PS3 wireless controllers for my SUMOSYS retro game console and the standard setup instructions didn't work. Here's how I got them working with SUMOSYS 700.

1. Connect one working default controller and one new wireless controller (with the USB cable). In my case, the wireless controller was constantly vibrating--don't worry about this.
2. Go to the menu, choose controller setup, and map the buttons. The controller should now work with the cable attached (but possibly still vibrating).
3. Detach the cable from the new controller and go to the controller menu again with the controller flashing for a Bluetooth connection. I didn't have to do anything else here--it just found it.
4. The wireless controller was now working, so I unplugged the old wired controller, went to the controller menu, and assigned the new controller to P1 (player 1 in games).
5. Use the menu to shut down. This saves the first controller.
6. Repeat the process for the other controllers, but you can skip connecting the wired controller since you have a working wireless controller to navigate through the menus.

I tried 1942 and Mario Kart and they work great so far. Big improvement!

Note: to exit games, I use Start + Hotkey (which I set to the Home button)

Tuesday, February 13, 2018

How to Create a Kid-safe YouTube Playlist

Now that I've got the Beta of Hive Video live, I'd like some feedback, so I'm getting the word out about some of the best use cases for this free application. While it's true that Hive Video is the simple alternative to video editing, it's also true that there's more to the app than creating free highlight reel playlists from YouTube videos.


Two use cases that are perfect for Hive Video are easily creating replay highlights for sports teams and kid-friendly video playlists for parents. In this post, we'll dive into the second one...

As a stay-at-home dad, I'm well aware of one of the modern parenting difficulties: limiting screen time and controlling what your kids are watching. Hive Video highlight reels solve many issues:

  • since you only add the videos you want, you always know what your kids are watching
  • you can set the start and end time of each clip, so you also know exactly which part of each video your children are seeing
  • all of the content is streaming from YouTube, but since you're not on the YouTube site (or app), there are no comments and no recommended videos
Even the YouTube Kids app does not address these issues because suggested videos are easily accessible to little fingers. It drives my wife crazy when our kids are watching a video and they start tapping around to open up other videos. Even if you're sitting next to them, they can do it too quickly for you to stop them, and then they want to watch those other videos they discovered. In our house, it's the unboxing videos that always used to come up--that is before we created Hive Video playlists.

Try the free Hive Video app and I recommend you register (also free) so you can manage multiple highlight reels under your account.

Tuesday, January 30, 2018

AvePoint Takes Fire for Unethical Marketing

Having worked in the Microsoft SharePoint space for many years, I know that it's unusual for SharePoint Independent Software Vendors (ISVs) to get much mainstream press coverage, so I was surprised to see a Washington Business Journal article about an AvePoint marketing campaign. The article, All's fair in sales and marketing? Competitor blogs that Metalogix is for sale, tries to poach customers, highlights a shady marketing tactic recently employed by AvePoint against my former employer, Metalogix.

(Update: The Metalogix response, "Metalogix is Forever" has been posted now.)

Others have written about this dishonourable marketing move (for example: An Open Letter to AvePoint | An Unethical Marketing Campaign), and I can understand their frustration. The SharePoint partner community used to be a pretty tight-knit group focused on "coopetition." Sure, we were competing, but we also worked together to raise the profile of SharePoint--which BTW was a great strategy that paid off for everyone. As they say, 'a rising tide raises all boats.'

But that's just not the case anymore. As money poured in, investors took notice and the atmosphere changed. A win-at-all-costs mentality emerged at some companies, and the whole scene became a lot less interesting for many of us who started in content management before the SharePoint era. I suspect this marketing ploy will backfire.

Sunday, December 31, 2017

Hive Video - Free Highlight Reel Creator for YouTube is Live!

I'm thrilled to return to blogging after a crazy few months working on my web app for YouTube videos: Hive Video. The Beta is live so head on over and you can create free highlight reels from YouTube videos.

Hive Video is the simple alternative to video editing. Whether you're looking to highlight the best parts of a tutorial, put together the best moments of various videos (e.g., sports), or just create a kid-friendly playlist without comments, ads, or recommended videos--it's easy, free, and doesn't require any install. 

Once you've created your highlight reel or mashup of YouTube videos, you can quickly share the results with a public view URL. Send it via social media, email, or whatever, it's just plain easy.

Check it out! I'd appreciate any feedback.

My GitHub heat map for 2017 tells the story pretty clearly--I started the project in April.


This new project is under the umbrella of my software startup: Second Marker.

Update: we now support Twitch VOD videos as well.

Tuesday, November 14, 2017

How to View Files in a Google App Engine Docker Container

This post has been adapted (abridged) from the Google Cloud Platform (GCP) Debugging an Instance documentation. I wanted to see the file structure that was being deployed to my Angular 5 application for my YouTube video highlight web app (Tunnel Video). These are the steps required.

1. Go to your App Engine instances in the GCP console

2. Find the instance and choose the SSH option to open a terminal in a browser window

3. Once the terminal window for the instance opens, list the docker containers: $ sudo docker ps

4. Find the docker container in the list with your project name in it and then run: $ container_exec containerId /bin/bash

5. This will open a shell in your container. From there, just list your files as usual: ls app

Wednesday, October 18, 2017

Postman with Authenticated Google API HTTP Requests

Let's say I want to send an HTTP GET request to a Google API using the super helpful Postman application. That's simple enough, I'll just choose GET and enter the URL. In this example, I'll search YouTube for videos with "test" in their details.

Breaking down this URL, you'll see that the query ("q") is "test", that I'm asking for only the 'snippet' details of each result ("part=snippet"), and that I'm restricting the results to videos ("type=video"). Without the optional type parameter, the search would also return channels.
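To make the parameter breakdown concrete, here's how that query string could be assembled in code (the endpoint is YouTube's v3 search API; the key is a placeholder):

```javascript
// Build the YouTube search URL described above.
const params = new URLSearchParams({
  part: "snippet",   // only return the 'snippet' details
  q: "test",         // the search query
  type: "video",     // restrict results to videos (omit to also get channels)
  key: "YOUR_API_KEY",
});

const url = `https://www.googleapis.com/youtube/v3/search?${params}`;
console.log(url);
// https://www.googleapis.com/youtube/v3/search?part=snippet&q=test&type=video&key=YOUR_API_KEY
```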

The only cumbersome thing about this GET request is that I need a Google API Key. These keys allow Google to limit how many API requests any account is making--and potentially charge the user when the count gets too high. To get a key, you need a Google Cloud Platform (GCP) account; then follow the instructions to create an API key from the API Manager section.

Since API keys are--by definition--limited, most people try to keep them private and restrict who can use them. There are a few different ways to add API Key Restrictions in GCP. "HTTP referrers (websites)" is a popular and straightforward option. For example, if you were using the key from a local website, you could add http://localhost:/* to the list.

Note: If you don't care about your key being used by others, then you can leave the key with no restrictions and skip the next part.

In Postman, you can pretend you're sending the request from a local website by adding a "Referer" header entry. However, since it's a restricted header, there is an extra step: you must turn on Postman Interceptor by clicking the Interceptor button at the top of the window (next to the sync button). If Interceptor is off, the Referer header entry will be ignored and you'll get an error: "Error 403: The request did not specify any referer. Please ensure that the client is sending referer or use the API Console to remove the referer restrictions."

Now you're sending the GET request with an API key and you're getting back JSON results that look like this:

    {
        "kind": "youtube#searchListResponse",
        "etag": "\"m2yskBQFythfE4irbTIeOgYYfBU/71Y1Pa_Vox_0ZzzjdbBNppwdf0s\"",
        "nextPageToken": "CAUQAA",
        "regionCode": "CA",
        "pageInfo": {
            "totalResults": 1000000,
            "resultsPerPage": 5
        },
        "items": [
            {
                "kind": "youtube#searchResult",
                "etag": "\"m2yskBQFythfE4irbTIeOgYYfBU/WePlVVP0Z4fWK6zl92pA9jVLbdQ\"",
                "id": {
                    "kind": "youtube#video",
                    ...
                }
            },
            ...
        ]
    }

For more about the Google API options for YouTube Search, refer to the YouTube developer documentation.

Just that query alone is useful, but there's still one key thing missing from our request--authentication. I'm not going to go into detail about OAuth authentication here--you can read about that elsewhere--so this section will just follow the basic steps. To use the Google APIs as an authenticated user, you need an OAuth token.

There are a few ways to get a Google OAuth token. Probably the simplest option is to use Google's own OAuth playground. This is a super useful app that allows you to fiddle with all sorts of settings.

Another option is to use Postman's built-in Get New Access Token feature. To use it, click on Authorization (next to the Headers tab) and then the "Get New Access Token" button. Here you would enter Google's Auth URL and Access Token URL. You also need to know the Scope for your request--which you probably just want to go to Google's OAuth Playground to get anyway.

Once you have an access token, a simple way to check its validity is to paste Google's token info URL (with your token appended) into a browser address bar.

Note: Checking the token will show its expiration time and also its scopes.

Tuesday, September 19, 2017

Google Cloud Platform for a Full-stack Angular Web Application

At first blush, getting a complete Angular web application deployed and running on the Google Cloud Platform (GCP) can be a daunting task. Each quickstart tutorial is reasonable enough, but if you read ahead through all of the various documentation pages you'll need, they can soon begin to snowball.

UPDATE: After I started writing this post, I discovered a Google lab that covers the same topic: Build a Node.js & Angular Web App Using Google Cloud Platform.

I'll just quickly provide an overview of deploying Angular as the front-end app (on Google App Engine) and MySQL with a Node.js API on the backend. MySQL will run on Cloud SQL and Node.js runs on App Engine.

The basic steps involved are:

1. Set up a GCP account and project

2. Download the Angular sample and install the requirements

3. Deploy your Angular front-end app to GCP

4. Set up a MySQL DB on Cloud SQL. I wrote a whole post about setting up MySQL on Google Cloud SQL.

5. Complete the tutorial for using Google Cloud with Node.js

Since these samples aren't linked, you'll have to test the pieces separately until you develop some interaction in your Angular app. You can use Postman to test your Node.js API.

Monday, August 14, 2017

Set up MySQL on Google Cloud Platform

Here's a walkthrough of how to set up MySQL on Google Cloud Platform's (GCP) Cloud SQL service.

1. Create a Google account if you don't have one.

2. Sign up for a GCP free trial.

3. Create a GCP project to house your MySQL instance.

4. Create a Cloud SQL instance--obviously, choosing MySQL as the DB type.

5. Create a database in the instance by going to the Databases tab under the Instance. For this post, I'll create one called "test".

6. To add a schema to your DB, you can use the Cloud SQL command line, or MySQL Workbench. Of course, if you know SQL, you can just type out your schema, but if you want a visual representation of your DB, these tools are handy.

I like the simplicity and no-install aspect of the browser-based option, but I ran into a bug, so I don't use it as much as I would otherwise. Hopefully, they'll fix it soon. Until then, I guess I have to recommend MySQL Workbench since it's great in many ways.

7. If you just want to quickly test that your DB is working, you can run some simple queries in the Google Cloud Shell. To connect to your MySQL instance, open the shell (the little command-line icon in the top-right of the GCP dashboard) and type in this command:

googleusername@projectId:~$ gcloud beta sql connect dbinstance --user=dbusername

(The username is usually root.)

You'll be prompted for your password and then you'll see the MySQL prompt. Note that it might take a few seconds before it appears.

Whitelisting your IP for incoming connection for 5 minutes...
Enter password:

Welcome to the MySQL monitor.  Commands end with ; or \g.
Your MySQL connection id is 1234
Server version: 5.7.14-google-log (Google)
Copyright (c) 2000, 2016, Oracle and/or its affiliates. All rights reserved.
Oracle is a registered trademark of Oracle Corporation and/or its
affiliates. Other names may be trademarks of their respective
owners.

Type 'help;' or '\h' for help. Type '\c' to clear the current input statement.

8. To run your test queries, you'll need to select your DB within your instance.

mysql> use test
Database changed

Once you've selected the DB, you can run a simple operation to create a table and then use SHOW TABLES; to see the results.

The command I'll use to create a simple table is:

CREATE TABLE `testtable` (
  `id` INT NOT NULL,
  PRIMARY KEY (`id`)
);

This creates a new table with one column called "id".

And this is what it looks like when I paste this command into the Cloud Shell:

mysql> CREATE TABLE `testtable` (
    ->   `id` INT NOT NULL,
    ->   PRIMARY KEY (`id`)
    -> );
Query OK, 0 rows affected (0.03 sec)

mysql> show tables;
+----------------+
| Tables_in_test |
+----------------+
| testtable      |
+----------------+
1 row in set (0.00 sec)

9. Now that you have a working DB, you'll want to connect to it from code and possibly from a MySQL client (e.g., the 'dolphin' a.k.a. MySQL Workbench).

To get the info you need, go to the Instance Details page in GCP and look at the Properties under the Overview tab. What you need is the "IPv4 address". Grab that, but you won't be able to connect to your instance yet. First, you have to allow connections.

To allow your client to connect to your DB in the Google cloud, you need to open Authorized Networks and add your network IP. To get your client IP, type "what's my IP" into Google.

10. Now you can access your DB from a MySQL client, but what about a local dev instance of a GCP App Engine application? For that you need a Google Cloud SQL Proxy running.

NOTE: The default setting is for all apps within the same project to be authorized; if you're using the DB from a different GCP project, you'll have to authorize it explicitly.

Without the Cloud SQL Proxy running, your local dev instance of your app will return the error: Error: connect ECONNREFUSED

You'll need to install the Google Cloud SDK (its configuration will have you select a project; if you have multiple environments--e.g., test, dev, prod--you can switch later with: gcloud config set project projectName) and enable the Cloud SQL API, if you haven't already. You'll also need a credential file to supply to the proxy. To get this file, create a service account in the IAM & Admin section of GCP. Make sure you grant the account rights to Cloud SQL. When you create the account, you'll be able to create a new private key and download a JSON credential file. The Cloud Proxy will read this credential file.

11. Once you have everything ready, here is the command to start the Cloud Proxy:
./cloud_sql_proxy -instances=instanceConnectionName=tcp:3306 \
    -credential_file=credentials.json &

If all goes well, the Cloud Proxy command will return this output:

cawood$ 2017/04/23 22:34:39 Listening on for instanceConnectionName
2017/04/23 22:34:39 Ready for new connections

FYI: From the command line help:
  * On Google Compute Engine, the default service account is used.
    The Cloud SQL API must be enabled for the VM.

  * When gcloud is installed on the local machine, the "active account" is used
    for authentication. Run 'gcloud auth list' to see which accounts are
    installed on your local machine and 'gcloud config list account' to view
    the active account.

  * To configure the proxy using a service account, pass the -credential_file
    flag or set the GOOGLE_APPLICATION_CREDENTIALS environment variable. This
    will override gcloud or GCE credentials (if they exist).

Tuesday, July 11, 2017

Second Marker Software

My next project has begun. I’m working on a software startup called Second Marker.


Our first solution—and the inspiration for starting the company—is a free and easy alternative to video editing: Tunnel Video. If you want to make a highlight reel of YouTube videos, this will be the simplest solution.

There are endless possibilities for why someone would want to create and share a playlist of YouTube videos with start and end times--sports highlights, tutorial excerpts, and kid-friendly playlists, to name a few.

Update: try the Beta for free!

Wednesday, June 21, 2017

Google Cloud SQL Proxy on Same Machine as MySQL

As a web developer, it makes sense that you'd want to connect to your Google Cloud SQL instances running in Google Cloud Platform (GCP) via the Cloud SQL Proxy and also have MySQL running locally on your machine. Unfortunately, the default configurations for these two systems will cause an error when you try to run the proxy: "bind: address already in use."

One clear reason to run the proxy is that it allows you to securely connect to your Cloud SQL instances with MySQL Workbench. You can get the IP (hostname) and port from GCP no problem, but to connect, you need to be running Google's proxy. (You also need to create a JSON credential file from GCP and add your IP to the Authorized Networks BTW.)

The reason for the error is that both the proxy and the local default install of MySQL will try to communicate on port 3306. When you try to start the proxy, this is the result:

doom:project cawood$ ./cloud_sql_proxy -instances=instancename-9999:us-west1:instance=tcp:3306 -credential_file=CredFileName-dflkjal.json &

doom:project cawood$ using credential file for authentication;
2017/06/05 23:51:24 listen tcp bind: address already in use

To resolve this error, one solution is to create a config file for the local MySQL server and change the port.

One thing to note (as per the MySQL documentation) is that the installer for macOS will not create a config file: "Note As of MySQL 5.7.18, my-default.cnf is no longer included in or installed by distribution packages." If you don't have a config file already, you can create one with these commands.

$ cd /etc
$ sudo nano my.cnf

This is all you need in the file (the port is set for both the server and the client):

[mysqld]
port = 3366

[client]
port = 3366

Once you're done, use Ctrl+O to save and Ctrl+X to exit Nano.

It's worth noting that I used to see this error before I installed MySQL locally. The reason was that another Cloud SQL Proxy process ("cloud_sql_proxy") was already running. The solution to that is simply to kill the process.

Wednesday, April 12, 2017

Node.js Authentication Error Connecting to Google Cloud SQL Proxy

I was receiving this error trying to run a local instance of a Node.js API against a Google Cloud Platform (GCP) database.

Error: ER_ACCESS_DENIED_ERROR: Access denied for user ''@'cloudsqlproxy~' (using password: NO)
    at Handshake.Sequence._packetToError (/Users/cawood/GitHub/project/node_modules/mysql/lib/protocol/sequences/Sequence.js:52:14)
    at Handshake.ErrorPacket (/Users/cawood/GitHub/project/node_modules/mysql/lib/protocol/sequences/Handshake.js:103:18)
    at emitOne (events.js:96:13)
    at Socket.emit (events.js:191:7)
    at readableAddChunk (_stream_readable.js:178:18)
    at Socket.Readable.push (_stream_readable.js:136:10)

The solution was quite straightforward, but when I first googled the error, I couldn't find anything about the access denied error showing no user--normally the username appears in the error (i.e., 'username'@'cloudsqlproxy~').

I thought the error was with the GCP Cloud SQL Proxy, but I was initiating it correctly and with a valid credential file for a service account:

$ ./cloud_sql_proxy -instances=name-111111:us-central1:instancename=tcp:0000 \ -credential_file=serviceAccountCreds.json &

The problem was actually with the environment variables for my local instance of the Node.js API: I hadn't exported them. Of course, the error was correct--I wasn't trying to connect with any user at all.

$ export MYSQL_USER="username" 
$ export MYSQL_PASSWORD="password" 
$ export MYSQL_DATABASE="test" 
$ npm start

To check if you have set these correctly, you can output them to the console when you start the server: console.log(config);

cawood$ npm start 
> projectapi@0.0.1 start /Users/cawood/GitHub/project
> node server.js 
 { user: 'username', password: 'password', database: 'test' } 
App listening on port 8000 Press Ctrl+C to quit.
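That check can be hardened a little. Here's a hedged sketch (the function name and warning message are mine; the variable names mirror the exports above):

```javascript
// Build the DB config from an environment object (pass process.env in the
// real server) and flag the exact failure mode described above: an unset
// MYSQL_USER produces the ''@'cloudsqlproxy~' access denied error.
function buildConfig(env) {
  const config = {
    user: env.MYSQL_USER,
    password: env.MYSQL_PASSWORD,
    database: env.MYSQL_DATABASE,
  };
  if (!config.user) {
    console.log("MYSQL_USER is not set -- the API would connect as an empty user");
  }
  return config;
}

// Simulate a shell where the exports were forgotten:
console.log(buildConfig({}));
// ...and one where they were set:
console.log(buildConfig({ MYSQL_USER: "username", MYSQL_PASSWORD: "password", MYSQL_DATABASE: "test" }));
```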

Friday, March 10, 2017

Google Acquires AppBridge Software

At the Google Cloud Next conference yesterday it was announced that Google has acquired AppBridge. I've been at AppBridge for just over a year and I'm thrilled to be a part of this Vancouver software success story.

AppBridge is a tremendous addition to Google's solutions because the founders designed the AppBridge Transformation Suite from the ground up for performance and the cloud. AppBridge is already handling the world's largest G Suite and Google Drive migrations.

Here is a quote from Google's announcement:

"Migrating to the cloud can be complex. It's not just your files that need to be moved; permissions also need to map correctly; content likely needs to be reorganized, and some data probably needs to be archived. To address that challenge, today we are announcing the acquisition of AppBridge, an enterprise-grade, G Suite migration tool that helps organizations seamlessly migrate from their on-prem, cloud-based and hybrid solutions to Google Drive.

With AppBridge, your organization can migrate files effortlessly to G Suite from your existing file servers or content management systems like SharePoint, or from many other cloud platforms you might be using. File permissions are also brought over when you migrate, which means your team's file access remains unchanged and your data stays safe. We’re working together with AppBridge to bring them into the G Suite team. Stay tuned for more information in the near future."

As for me, I'm going to take some time to think about what I want for my next adventure, but this one was certainly memorable.

TechCrunch Article: Google acquires AppBridge to help enterprises move their files to its cloud services

Wednesday, February 15, 2017

Google Sheets Split into Columns

I'm very much appreciating Google Sheets' split into columns feature. Ya, I know, it's not like it hasn't been done before, but man it's nice to have when you need it.

To split text into columns using separator options (including space, comma, custom, etc.):
  1. Open a spreadsheet in Google Sheets.
  2. Paste the data you want to split into columns.
  3. In the bottom right corner of your data, click the Paste icon.
  4. Click Split text to columns. Your data will split into different columns.
  5. To change the delimiter, in the separator box, click Comma to open the dropdown of options.
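Conceptually, the feature is just a delimiter split applied to each row of pasted data; in code it would look something like:

```javascript
// What "Split text to columns" does with a comma separator, conceptually.
const row = "name,age,city";
const columns = row.split(",");
console.log(columns); // [ 'name', 'age', 'city' ]
```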

Wednesday, January 18, 2017

Building Video Games with My Daughter and Lego

I asked my (then) 4-year-old daughter if she wanted to make a video game, and her response was exactly what I expected: "We can't make a game." "Actually," I said, "we can! Want to do it?" She emphatically said yes, so we set out to make an iOS game. Of course, I had already done some playing around with Unity 3D, so I knew that a 2D game wouldn't be too much work.

I wanted her to have the best experience possible, so I devised a plan to take her ideas and convert them into the game without changing much along the way. I used a sample 2D game as a starting point so we wouldn't have to spend too much time writing code before she saw her creations in a working game. Note that if you don't deploy your app to the store, you can do all of this for free (plus the cost of LEGO of course).

We started by drawing out the main elements of the game by hand. Here are the villain and hero characters as my daughter drew them.

After that, it was off to the LEGO store so we could get enough 2x2 pieces to build whatever we wanted. Here's the villain LEGO version my daughter created from her original drawing.

We designed the rest of the game elements including a tree and an egg, and then I used photos of the LEGO concepts to convert her creations to simple sprite versions. Here's another character before reducing the pixels down to a simple sprite.

My daughter building LEGO versions of the game sprites.

This is a screenshot from the first build I deployed to my iPhone (just for fun, I threw in a photo of my daughter as the player). Not bad for a 4-year-old with some LEGO!

We had a great time on this project and I was prompted to write this post when she recently asked if we could play the game again.