Tuesday, October 05, 2021
10 Things I Undervalued When I Started Playing Minecraft
Monday, April 05, 2021
Reel BJJ Public Beta is live!
I'm super excited to have opened up the web application version of Reel BJJ for public Beta feedback. Reel BJJ is a video management application for athletes and coaches who want to get the most out of their Brazilian jiu-jitsu videos. (Try it now--there's a free trial.)
Why create a jiu-jitsu video app? The answer is simple. There is an amazing library of jiu-jitsu content available from a number of sites such as BJJ Fanatics, GrappleArts, Grapplers Guide, BJJ Library, and Gracie University--the list goes on. In fact, in a recent interview with Bernardo Faria on BJJ Fanatics, Mikey Musumeci made exactly this point. The content available to people wanting to learn techniques is seemingly endless. However, the way video is used for learning jiu-jitsu hasn't changed--most people just watch an instructional all the way through and hope they remember the best parts. The mission of Reel BJJ is to change the way people think about learning jiu-jitsu via video.
In short, Reel BJJ can help everyone (athletes, instructors, coaches, or hobbyists) get the most value possible from video content. The first main feature of the Reel BJJ Beta release is called Reels (and before anyone gets upset with me, I created Reels a while back, so yes, it was before Instagram stole my thunder). Reels allow anyone to quickly (and easily) create highlights from their jiu-jitsu videos and then just watch the parts of the videos they want to review. (You can see how Reels work in this short walkthrough video of the Reel BJJ Beta.)
Some of the most common scenarios for Reels include:
- Techniques you'd like to learn
- Techniques you'd like to try before class
- Footage of your sparring rounds
- Footage of your competition matches
- Research on opponents for upcoming competitions
- A game plan of techniques
The key, of course, is that Reel BJJ users can do this without installing any software or doing any video editing. Users can add videos from YouTube or Vimeo, or even upload their own videos, and then create highlights. Naturally, you'll want to add notes to your highlights, so Reel BJJ has a cool notes feature that just shipped.
It's important to note that Reel BJJ does not download videos from streaming services or violate their terms of service. The videos remain at the source and aren't altered in any way.
I'm excited to see what will happen. The Beta feedback so far has been fantastic. For example, here's some feedback from Thomas Lisboa: "It's unbelievable how jiu-jitsu is constantly evolving. That's why I was so excited when I discovered Reel BJJ. I can select the parts I liked most in a course and review them. For those who like to study jiu-jitsu, regardless of their level and goals, Reel BJJ is the evolution of jiu-jitsu study." - Thomas Lisboa (Head Coach of Alliance BJJ Vancouver)
Wednesday, November 11, 2020
Group Policy Prevents Microsoft Apps Working on Windows 10
I recently ran into an odd problem on my home Windows 10 PC. Even though the machine isn't part of a domain, I started getting a Group Policy error any time I tried to use a program that requires a Microsoft account. This is the error:
"This program is blocked by group policy. For more information contact your system administrator. 0x800704ec"
This error can affect any app or program that uses your Microsoft account, including the Microsoft Store, the Xbox app, the Feedback Hub, and even Minecraft.
This error happens when your Microsoft account has been linked to a work or school account that has a group policy preventing the app from running properly. To fix the issue, disassociate the problematic account from your home Microsoft account.
To remove the account, go to Settings > Accounts > Access work or school. Find any reference to a work or school account associated with your account and choose Disconnect. Of course, this assumes you don't mind losing that association on your home account.
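If you're not sure whether a work or school account is still linked, one quick check is the dsregcmd tool from a command prompt--look at the AzureAdJoined and WorkplaceJoined values in the output:
C:\> dsregcmd /status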
Tuesday, October 27, 2020
Cypress.io testing with Auth0 and Angular 10
I read the documentation on both the Auth0 site and the Cypress.io site about using the two technologies together with the Angular framework, but I simply couldn't get it to work.
For example, the tutorial on the Auth0 Blog called The Complete Guide to Angular User Authentication with Auth0 got me most of the way there and the same was true for the Cypress.io "real-world app" code and documentation on GitHub (which is written for React BTW).
These two resources (plus some community posts) allowed me to set up all the underpinnings of a working test system but didn't resolve all my issues. In the past, my Angular project exclusively used Google Authentication, so my Cypress tests required me to manually log in first and then run all the tests that needed an authenticated user. It wasn't perfect, but it was workable. However, this manual login option did not work with Auth0 (and Angular 10). Google Authentication would not allow the login window to open in an iFrame when running in the context of a Cypress test (i.e., when Cypress was automating/controlling the browser).
This is what I ultimately did to get it working:
1. Switch my Node.js API from Google tokens to Auth0 authentication tokens by following the Auth0 quickstart for Node.js (Express) backends.
2. Set up a test SPA Application in Auth0 with lower security than my production app. This allowed me to enable username/password logins and turn on the Password Grant Types (under Application > Advanced Settings).
3. Follow the Cypress real-world app section called "Cypress Setup for Testing Auth0" to add a Login function to my Cypress.io setup.
Under Cypress > support > commands.ts, I now have the code below. You can also use plain JavaScript, of course. The configuration for the variables is found under src > cypress.env.json (a sketch of that file follows the commands code below).
/// <reference types="cypress" />
/// <reference types="jwt-decode" />
import jwt_decode from 'jwt-decode';

Cypress.Commands.add('login', (overrides = {}) => {
  const username = Cypress.env('auth0_username');
  cy.log(`Logging in as ${username}`);
  cy.request({
    method: 'POST',
    url: Cypress.env('auth0_url'),
    body: {
      grant_type: 'password',
      username: Cypress.env('auth0_username'),
      password: Cypress.env('auth0_password'),
      audience: Cypress.env('auth0_audience'),
      scope: Cypress.env('auth0_scope'),
      client_id: Cypress.env('auth0_client_id'),
      client_secret: Cypress.env('auth0_client_secret'),
    },
  }).then(({ body }) => {
    const claims: any = jwt_decode(body.id_token);
    const { nickname, name, picture, updated_at, email, email_verified, sub, exp } = claims;
    const item = {
      body: {
        ...body,
        decodedToken: {
          claims,
          user: {
            nickname,
            name,
            picture,
            updated_at,
            email,
            email_verified,
            sub,
          },
          audience: '',
          client_id: '',
        },
      },
      expiresAt: exp,
    };
    window.localStorage.setItem('auth0Cypress', JSON.stringify(item));
    return body;
  });
});

let LOCAL_STORAGE_MEMORY = {};

Cypress.Commands.add('saveLocalStorageCache', () => {
  Object.keys(localStorage).forEach(key => {
    LOCAL_STORAGE_MEMORY[key] = localStorage[key];
  });
});

Cypress.Commands.add('restoreLocalStorageCache', () => {
  Object.keys(LOCAL_STORAGE_MEMORY).forEach(key => {
    localStorage.setItem(key, LOCAL_STORAGE_MEMORY[key]);
  });
});
4. With these commands added to Cypress.io, I can now log in programmatically to Auth0 using the following code in my first test. You'll note that the authenticated user details are written to local storage (but only during testing), and the Auth0 access token is stored as a cookie--this is the token that I use with my backend API.
describe('Login', () => {
  beforeEach(() => {
    cy.restoreLocalStorageCache();
  });

  it('Should successfully login', () => {
    cy.login()
      .then((resp) => {
        return resp;
      })
      .then((body) => {
        const { access_token, expires_in, id_token } = body;
        const auth0State = {
          nonce: '',
          state: 'some-random-state'
        };
        // write access token to user-token cookie
        cy.setCookie('user-token', access_token);
        const callbackUrl = `/callback#access_token=${access_token}&scope=openid&id_token=${id_token}&expires_in=${expires_in}&token_type=Bearer&state=${auth0State.state}`;
        cy.visit(callbackUrl, {
          onBeforeLoad(win) {
            win.document.cookie = 'com.auth0.auth.some-random-state=' + JSON.stringify(auth0State);
          }
        });
      });
  });

  afterEach(() => {
    cy.saveLocalStorageCache();
  });
});
5. This all seemed to work great; however, Auth0 still would not recognize that the user was authenticated. The Auth0 client would always return false for isAuthenticated. To get around this issue, I had to hack my AuthGuard.
This is what I had before adding Cypress.io:
return this.auth.isAuthenticated$.pipe(
  tap(loggedIn => {
    if (!loggedIn) {
      this.auth.login(state.url);
    }
  })
);
To get around the issue, I simply added another option: if the code is being run by Cypress, I check for the stored user credential and token. I even added a check that it's the right authenticated user, but that's not strictly needed.
// @ts-ignore
if (window.Cypress) {
  const auth0credentials = JSON.parse(localStorage.getItem("auth0Cypress")!);
  const user = auth0credentials.body.decodedToken.user;
  const access_token = auth0credentials.body.access_token;
  if (user.name === 'youtestuser@yourdomain.com' && access_token) {
    return true;
  } else {
    return this.auth.isAuthenticated$.pipe(
      tap(loggedIn => {
        if (!loggedIn) {
          this.auth.login(state.url);
        }
      })
    );
  }
} else {
  return this.auth.isAuthenticated$.pipe(
    tap(loggedIn => {
      if (!loggedIn) {
        this.auth.login(state.url);
      }
    })
  );
}
Well, I think that's everything. I hope a better answer is coming, as this hack around the AuthGuard is not a perfect solution, but it does let me move forward and that's promising.
Wednesday, May 27, 2020
Debugging TypeScript Phaser Apps Server and Client-side Using VS Code
Note: You can read more about setting up VS Code for TypeScript debugging on the Microsoft site, but one tip to remember: you'll need "sourceMap": true in your tsconfig.json file.
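For reference, the relevant bit of tsconfig.json is just this (a minimal sketch--your real file will have more compiler options):
{
  "compilerOptions": {
    "sourceMap": true
  }
}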
If you're using Phaser for a multiplayer game, you're probably using Node.js and Express to fire up a server for the Phaser client-side code. I found this complicated debugging with breakpoints in VS Code, and I haven't been able to get a single configuration to hit breakpoints in both the server and client-side code. This makes sense, but I still thought it would be handled more easily.
If I launch a debugger for the server, any breakpoints I've set in my client-side code show "Breakpoint set but not yet bound" or the more direct error, "Breakpoint ignored because generated code not found (source map problem?)". My workaround is to launch the debugger configuration that matches whichever breakpoints I need.
Server-side debugging
To debug the server, I use VS Code's attach-to-process option and choose the dist/server.js process (server.js is the transpiled version of my server.ts file that starts the Node.js server).
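Here's a sketch of the two launch.json configurations I ended up with for the scenarios below (the names, URL, and port are assumptions--adjust them for your project):
{
  "version": "0.2.0",
  "configurations": [
    {
      "name": "Attach to server",
      "type": "node",
      "request": "attach",
      "processId": "${command:PickProcess}"
    },
    {
      "name": "Launch client-side",
      "type": "chrome",
      "request": "launch",
      "url": "http://localhost:8080",
      "webRoot": "${workspaceFolder}"
    }
  ]
}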
Client-side debugging
To debug the client (this uses the Visual Studio Code Chrome debugger extension), I use VS Code to launch the debugger with the Launch client-side task.
Tuesday, April 21, 2020
Running NPM in Windows Terminal Ubuntu profile
When I tried to run npm from the Ubuntu profile in Windows Terminal, the errors looked like this:
not foundram Files/nodejs/npm: 3: /mnt/c/Program Files/nodejs/npm:
: not foundram Files/nodejs/npm: 5: /mnt/c/Program Files/nodejs/npm:
/mnt/c/Program Files/nodejs/npm: 6: /mnt/c/Program Files/nodejs/npm: Syntax error: word unexpected (expecting "in")
Here's the fix:
$ sudo apt-get update
$ sudo apt-get install -y nodejs
$ sudo apt-get install build-essential
$ sudo apt-get install npm
**CLOSE AND REOPEN TERMINAL**
I found it here: https://github.com/microsoft/WSL/issues/1512
BTW -- example split-pane keys: Alt+Shift+-, Alt+Shift+= will split horizontally and vertically.
Sunday, June 16, 2019
Minecraft Bonding With My Daughter
I hadn't played much Minecraft since a co-worker showed me an original beta version, but I knew enough about the best-selling PC game to suggest that my creative daughter give it a try. I thought it would be a great outlet for her crafty passion and also a nice way to get some more exposure to computers. At first, we tried Creative mode and, having just watched The Swiss Family Robinson, we jumped into building a treehouse. We spent a reasonable amount of time working on that world and then one (or both?) of us suggested we try Survival mode.
Survival mode is a real-time strategy game and is therefore highly addictive. Before we knew it, we were using any time we could to build up our world and it got quite intense. The first time my daughter fell into lava, she was distraught because she had lost everything she was carrying: tools, weapons, supplies, etc. I started to regret ever leaving Creative mode, but things calmed down and we're now at a reasonable place where it's just one of the things we do and it's not such a big deal if someone goes swimming in lava.
Here's a screenshot of the first survival house we built that wasn't just a shelter from monsters.
The best part is that she got interested in Redstone engineering. For anyone who doesn't know, Redstone is the Minecraft way to create circuits that can power machines, traps, and other contraptions. My wife suggested a lighthouse would be cool, so we built a functioning lighthouse together using Redstone. She even has her own Creative mode test world where she tries out Redstone inventions before she uses them in her other worlds. It's super cool.
I highly recommend this sort of cooperative play with your kids--just watch those screen time limits. We've thoroughly enjoyed the experience and my daughter has even started doing a Minecraft after school program.
Sunday, April 08, 2018
Deploying Python app to Google Cloud Platform (GCP) fails for numpy==1.9.3
To resolve the issue, I changed this line in requirements.txt from:
to:
Then I ran:
I found this solution in a comment on this GitHub issue: https://github.com/pandas-dev/pandas/issues/20697
Saturday, March 31, 2018
Using PS3 wireless controllers with SUMOSYS
1. Connect one working default controller and one new wireless controller (with the USB cable).
(In my case the wireless controller was constantly vibrating--don't worry about this.)
2. Go to the menu, choose controller setup and select the buttons. The controller should now work with the cable attached (but possibly still vibrating).
3. Detach the cable from the new controller and go to the controller menu again with the controller flashing for Bluetooth connection. I didn't have to do anything else here--it just found it.
4. The wireless controller was now working, so I unplugged the old wired controller, went to the controller menu and assigned the new controller to P1 (player 1 in games).
5. Used the menu to shut down. This saves the first controller.
Repeat the process for the other controllers, but you can skip connecting the wired controller since you have a working wireless controller to navigate through the menus.
I tried 1942 and Mario Kart and they work great so far. Big improvement!
Note: to exit games, I use Start + Hotkey (which I set to the Home button)
Tuesday, February 13, 2018
How to Create a Kid-safe YouTube Playlist
As a stay-at-home dad, I'm well aware of one of the modern parenting difficulties: limiting screen time and controlling what your kids are watching. Hive Video highlight reels solve many issues:
- since you only add the videos you want, you always know what your kids are watching
- you can set the start and end time of each clip, so you also know exactly which part of each video your children are seeing
- all of the content is streaming from YouTube, but since you're not on the YouTube site (or app), there are no comments and no recommended videos
Tuesday, January 30, 2018
AvePoint Takes Fire for Unethical Marketing
(Update: The Metalogix response, "Metalogix is Forever" has been posted now.)
Others have written about this dishonourable marketing move (for example: An Open Letter to AvePoint | An Unethical Marketing Campaign), and I can understand their frustration. The SharePoint partner community used to be a pretty tight-knit group focused on "coopetition." Sure, we were competing, but we also worked together to raise the profile of SharePoint--which BTW was a great strategy that paid off for everyone. As they say, 'a rising tide raises all boats.'
But that's just not the case anymore. As money poured into the space, investors took notice and the atmosphere changed. A win-at-all-costs mentality emerged from some companies, and the whole situation became a lot less interesting for many of us who started in content management before the SharePoint era. I suspect this marketing ploy will backfire.
Sunday, December 31, 2017
Hive Video - Free Highlight Reel Creator for YouTube is Live!
Tuesday, November 14, 2017
How to View Files in a Google App Engine Docker Container
1. Go to your App Engine instances in the GCP console
https://console.cloud.google.com/appengine/instances
2. Find the instance and choose the SSH option to open a terminal in a browser window
3. Once the terminal window for the instance opens, list the docker containers: $ sudo docker ps
4. Find the docker container in the list with your project name in it and then run: $ container_exec containerId /bin/bash
5. This will open a shell in your container. From there, just list your files as usual: ls app
Wednesday, October 18, 2017
Postman with Authenticated Google API HTTP Requests
The only cumbersome thing about this GET request is that I need a Google API key. These keys allow Google to limit how many API requests any account is making--and potentially charge the user when the count gets too high. To get a key, you need a Google Cloud Platform (GCP) account; then follow the instructions to create an API key from the API Manager section.
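For context, the GET request in question is a YouTube Data API search. It looks something like this (the search term and maxResults are just examples):
GET https://www.googleapis.com/youtube/v3/search?part=snippet&q=jiu+jitsu&maxResults=5&key=YOUR_API_KEY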
Since API keys are--by definition--limited, most people try to keep them private and restrict who can use them. There are a few different ways to add API Key Restrictions in GCP. "HTTP referrers (websites)" is a popular and straightforward option. For example, if you were using the key from a local website, you could add http://localhost:/* to the list.
Note: If you don't care about your key being used by others, then you can leave the key with no restrictions and skip the next part.
In Postman, you can pretend you're sending the request from a local website by adding a "Referer" header entry. However, since it's a restricted header, there is an extra step. You must turn on Postman Interceptor by clicking the Interceptor button at the top of the window (next to the sync button). If you have Interceptor off, the Referer header entry will be ignored and you'll get an error: "Error 403: The request did not specify any referer. Please ensure that the client is sending referer or use the API Console to remove the referer restrictions."
Now you're sending the GET request with an API key and you're getting back JSON results that look like this:
{
  "kind": "youtube#searchListResponse",
  "etag": "\"m2yskBQFythfE4irbTIeOgYYfBU/71Y1Pa_Vox_0ZzzjdbBNppwdf0s\"",
  "nextPageToken": "CAUQAA",
  "regionCode": "CA",
  "pageInfo": {
    "totalResults": 1000000,
    "resultsPerPage": 5
  },
  "items": [
    {
      "kind": "youtube#searchResult",
      "etag": "\"m2yskBQFythfE4irbTIeOgYYfBU/WePlVVP0Z4fWK6zl92pA9jVLbdQ\"",
      "id": {
        "kind": "youtube#video",
        ...
For more about the Google API options for YouTube Search, refer to the YouTube developer documentation.
Just that query alone is useful, but there's still one key thing missing from our request--authentication. I'm not going to go into detail about OAuth authentication here--you can read about that elsewhere--so this section will just follow the basic steps. To use the Google APIs as an authenticated user, you need an OAuth token.
There are a few ways to get a Google OAuth token. Probably the simplest option is to use Google's own OAuth playground. This is a super useful app that allows you to fiddle with all sorts of settings.
Another option is to use Postman's built in Get New Access Token feature. To use it, click on Authorization (next to the Headers tab) and then the "Get New Access Token" button. Here you would enter the Auth URL as https://accounts.google.com/o/oauth2/auth and the Access Token URL as https://accounts.google.com/o/oauth2/token. You also need to know the Scope for your request--which you probably just want to go to Google's OAuth Playground to get anyway.
Once you have an access token, a simple way to check its validity is to paste this URL into a browser address bar with your token appended: https://www.googleapis.com/oauth2/v1/tokeninfo?access_token=
Note: Checking the token will show its expiration time and also its scopes.
Tuesday, September 19, 2017
Google Cloud Platform for a Full-stack Angular Web Application
UPDATE: After I started writing this post, I discovered a Google lab that covers the same topic: Build a Node.js & Angular Web App Using Google Cloud Platform.
I'll just quickly provide an overview of deploying Angular as the front-end app (on Google App Engine) with MySQL and a Node.js API on the backend. MySQL runs on Cloud SQL, and Node.js runs on App Engine.
The basic steps involved are:
1. Set up a GCP account and project
2. Download the Angular sample and install the requirements
3. Deploy your Angular front-end app to GCP
4. Set up a MySQL DB on Cloud SQL. I wrote a whole post about setting up MySQL on Google Cloud SQL.
5. Complete the tutorial for using Google Cloud with Node.js
Since these samples aren't linked, you'll have to test the pieces separately until you develop some interaction in your Angular app. You can use Postman to test your Node.js API.
Monday, August 14, 2017
Set up MySQL on Google Cloud Platform
1. Create a Google account if you don't have one: https://accounts.google.com
2. Sign up for a free trial on: https://console.cloud.google.com
3. Create a GCP project to house your MySQL instance.
4. Create a Cloud SQL instance--obviously, choosing MySQL as the DB type.
5. Create a database in the instance by going to the Databases tab under the Instance. For this post, I'll create one called "test".
6. To add a schema to your DB, you can use the Cloud SQL command line, https://dbdesigner.net or MySQL Workbench. Of course, if you know SQL, you can just type out your schema, but if you want a visual representation of your DB, these tools are handy.
I like the simplicity and no-install aspect of dbdesigner.net, but I ran into a bug, so I don't use it as much as I would otherwise. Hopefully, they'll fix it soon. Until then, I recommend MySQL Workbench since it's great in many ways.
7. If you just want to quickly test that your DB is working, you can run some simple queries in the Google Cloud Shell. To connect to your MySQL instance, open the shell (the little command line icon in the top-right of the GCP dashboard) and type in this command:
googleusername@projectId:~$ gcloud beta sql connect dbinstance --user=dbusername (usually root for username)
You'll be prompted for your password and then you'll see the MySQL prompt. Note that it might take a few seconds before it appears.
Whitelisting your IP for incoming connection for 5 minutes...
Enter password:
Welcome to the MySQL monitor. Commands end with ; or \g.
Your MySQL connection id is 1234
Server version: 5.7.14-google-log (Google)
Copyright (c) 2000, 2016, Oracle and/or its affiliates. All rights reserved.
Oracle is a registered trademark of Oracle Corporation and/or its
affiliates. Other names may be trademarks of their respective
owners.
Type 'help;' or '\h' for help. Type '\c' to clear the current input statement.
mysql>
8. To run your test queries, you'll need to select your DB within your instance.
mysql> use test
Database changed
Once you've selected the DB, you can run a simple operation to create a table and then use SHOW TABLES; to see the results.
The command I'll use to create a simple table is:
CREATE TABLE `testtable` (
`id` INT NOT NULL AUTO_INCREMENT,
PRIMARY KEY (`id`)
);
This creates a new table with one column called "id".
And this is what it looks like when I paste this command into the Cloud Shell:
mysql> CREATE TABLE `testtable` (
    -> `id` INT NOT NULL AUTO_INCREMENT,
    -> PRIMARY KEY (`id`)
    -> );
Query OK, 0 rows affected (0.03 sec)
mysql> show tables;
+----------------+
| Tables_in_test |
+----------------+
| testtable |
+----------------+
1 row in set (0.00 sec)
mysql>
9. Now that you have a working DB, you'll want to connect to it from code and possibly from a MySQL client (e.g., the 'dolphin' a.k.a. MySQL Workbench).
To get the info you need, go to the Instance Details page in GCP and look at the Properties under the Overview tab. What you need is the "IPv4 address". Grab that, but you won't be able to connect to your instance yet. First, you have to allow connections.
To allow your client to connect to your DB in the Google cloud, you need to open Authorized Networks and add your network IP. To get your client IP, type "what's my IP" into Google.
10. Now you can access your DB from a MySQL client, but what about a local dev instance of a GCP App Engine application? For that you need a Google Cloud SQL Proxy running.
NOTE: The default setting is for all apps within the same project to be authorized; if you're using the DB from a different GCP project, you'll have to authorize it explicitly.
Without the Cloud SQL Proxy running, your local dev instance of your app will return the error: Error: connect ECONNREFUSED xxx.0.0.xxx:3306.
You'll need to install the Google Cloud SDK (configuration will have you select a project; if you have multiple environments--e.g., test, dev, prod--you may need to switch at some point with gcloud config set project projectName) and enable the Cloud SQL API, if you haven't already. You'll also need a credential file to supply to the proxy. To get this file, create a service account in the IAM & Admin section of GCP. Make sure you grant the account rights to Cloud SQL. When you create the account, you'll be able to create a new private key and download a JSON credential file. The Cloud Proxy will read this credential file.
The full documentation is here: https://cloud.google.com/sql/docs/mysql/connect-external-app
11. Once you have everything ready, here is the command to start the Cloud Proxy:
./cloud_sql_proxy -instances=instanceConnectionName=tcp:3306 -credential_file=credentials.json &
If all goes well, the Cloud Proxy command will return this output:
cawood$ 2017/04/23 22:34:39 Listening on 127.0.0.1:3306 for instanceConnectionName
2017/04/23 22:34:39 Ready for new connections
FYI: From the command line help:
Authorization:
* On Google Compute Engine, the default service account is used.
The Cloud SQL API must be enabled for the VM.
* When gcloud is installed on the local machine, the "active account" is used
for authentication. Run 'gcloud auth list' to see which accounts are
installed on your local machine and 'gcloud config list account' to view
the active account.
* To configure the proxy using a service account, pass the -credential_file
flag or set the GOOGLE_APPLICATION_CREDENTIALS environment variable. This
will override gcloud or GCE credentials (if they exist).
Tuesday, July 11, 2017
Second Marker Software
Our first solution--and the inspiration for starting the company--is a free and easy alternative to video editing: Tunnel Video. If you want to make a highlight reel of YouTube videos, this will be the simplest solution.
There are endless possibilities for why someone would want to create and share a playlist of YouTube videos with start and end times, but some use cases include:
- Playlists of training videos for athletes and coaches
- Highlights of tutorial videos
- Easier sharing of multiple videos
- Kid-friendly playlists with none of the usual YouTube complications such as comments and recommended videos
Wednesday, June 21, 2017
Google Cloud SQL Proxy on Same Machine as MySQL
One clear reason to run the proxy is that it allows you to securely connect to your Cloud SQL instances with MySQL Workbench. You can get the IP (hostname) and port from GCP no problem, but to connect, you need to be running Google's proxy. (You also need to create a JSON credential file from GCP and add your IP to the Authorized Networks BTW.)
The reason for the error below is that both the proxy and a default local install of MySQL will try to communicate on port 3306. When you try to start the proxy, this is the result:
2017/06/05 23:51:24 listen tcp 127.0.0.1:3306: bind: address already in use
To resolve this error, one solution is to create a config file for the local MySQL server and change the port.
One thing to note (as per the MySQL documentation) is that the installer for macOS will not create a config file: "Note As of MySQL 5.7.18, my-default.cnf is no longer included in or installed by distribution packages." If you don't have a config file already, you'll need to create one yourself and set a non-default port, as in the sketch below.
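Here's a minimal sketch of such a config file (the path and port are assumptions--adjust for your install):
# /etc/my.cnf (hypothetical local config)
[mysqld]
port = 3307
With the local server moved to 3307, the proxy is free to listen on 3306.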
It's worth noting that I used to see this error before I installed MySQL locally. The reason was that another Cloud SQL Proxy process ("cloud_sql_proxy") was already running. The solution to that is simply to kill the process.
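Finding and killing a stray proxy process looks something like this (ps to confirm it's running, then pkill by name):
$ ps aux | grep cloud_sql_proxy
$ pkill cloud_sql_proxy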
Thursday, May 04, 2017
May the Fourth be with You
Wednesday, April 12, 2017
Node.js Authentication Error Connecting to Google Cloud SQL Proxy
Error: ER_ACCESS_DENIED_ERROR: Access denied for user ''@'cloudsqlproxy~174.7.116.43' (using password: NO)
    at Handshake.Sequence._packetToError (/Users/cawood/GitHub/project/node_modules/mysql/lib/protocol/sequences/Sequence.js:52:14)
    at Handshake.ErrorPacket (/Users/cawood/GitHub/project/node_modules/mysql/lib/protocol/sequences/Handshake.js:103:18)
    at emitOne (events.js:96:13)
    at Socket.emit (events.js:191:7)
    at readableAddChunk (_stream_readable.js:178:18)
    at Socket.Readable.push (_stream_readable.js:136:10)
The solution was quite straightforward, but when I first googled the error, I couldn't find anything about the access denied error showing no user--the username should appear in the error (i.e., 'username'@'cloudsqlproxy~...').
I thought the problem was the GCP Cloud SQL Proxy, but I was starting it correctly and with a valid credential file for a service account:
$ ./cloud_sql_proxy -instances=name-111111:us-central1:instancename=tcp:0000 -credential_file=serviceAccountCreds.json &
The problem was actually the environment variables for my local instance of the Node.js API--I hadn't exported them. The error was correct all along: I wasn't trying to connect with any user at all.
$ export MYSQL_USER="username"
$ export MYSQL_PASSWORD="password"
$ export MYSQL_DATABASE="test"
$ npm start
To check that you have set these correctly, you can output them to the console when you start the server: console.log(config);
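For reference, the config object being logged is just built from those environment variables--a minimal sketch (the property names are assumptions based on the output below):
// reads the exported MySQL environment variables
const config = {
  user: process.env.MYSQL_USER,
  password: process.env.MYSQL_PASSWORD,
  database: process.env.MYSQL_DATABASE
};
console.log(config);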
cawood$ npm start
> projectapi@0.0.1 start /Users/cawood/GitHub/project
> node server.js
{ user: 'username', password: 'password', database: 'test' }
App listening on port 8000 Press Ctrl+C to quit.