Node.js idiomatic client for Google Cloud Platform services.
This client supports the following Google Cloud Platform services at a General Availability (GA) quality level:
- Google BigQuery (GA)
- Cloud Datastore (GA)
- Cloud Natural Language (GA)
- Cloud Spanner (GA)
- Cloud Speech (GA)
- Google Stackdriver Logging (GA)
- Cloud Storage (GA)
- Cloud Translation API (GA)
- Cloud Video Intelligence (GA)
This client supports the following Google Cloud Platform services at a Beta quality level:
- Cloud Data Loss Prevention (Beta)
- Cloud Firestore (Beta)
- Cloud Pub/Sub (Beta)
- Cloud Vision (Beta)
- Google Stackdriver Monitoring (Beta)
This client supports the following Google Cloud Platform services at an Alpha quality level:
- Cloud Bigtable (Alpha)
- Cloud DNS (Alpha)
- Cloud Resource Manager (Alpha)
- Google Compute Engine (Alpha)
- Google Stackdriver Debugger (Alpha)
- Google Stackdriver Error Reporting (Alpha)
- Google Stackdriver Trace (Alpha)
If you need support for other Google APIs, check out the Google Node.js API Client library.
We recommend installing the individual packages that you need, which are provided under the @google-cloud namespace. For example:
```sh
$ npm install --save @google-cloud/datastore
$ npm install --save @google-cloud/storage
```

```js
var config = {
  projectId: 'grape-spaceship-123',
  keyFilename: '/path/to/keyfile.json'
};

var datastore = require('@google-cloud/datastore')(config);
var storage = require('@google-cloud/storage')(config);
```

The meta-package, `google-cloud`, has been deprecated.
- nodejs-getting-started - A sample and tutorial that demonstrates how to build a complete web application using Cloud Datastore, Cloud Storage, and Cloud Pub/Sub and deploy it to Google App Engine or Google Compute Engine.
- gcloud-node-todos - A TodoMVC backend using google-cloud-node and Datastore.
- gitnpm - Easily look up an npm package's GitHub repo using google-cloud-node and Google App Engine.
- gcloud-kvstore - Use Datastore as a simple key-value store.
- hya-wave - Cloud-based web sample editor. Part of the hya-io family of products.
- gstore-node - Google Datastore Entities Modeling library.
- gstore-api - REST API builder for Google Datastore Entities.
With google-cloud it's incredibly easy to get authenticated and start using Google's APIs. You can set your credentials on a global basis as well as on a per-API basis. See each individual API section below to learn how to authenticate on a per-API basis. This is useful if you want to use different accounts for different Cloud services.
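For example, here is a minimal sketch of per-API credentials. The project ID and key file paths are placeholders; each package accepts the same `projectId`/`keyFilename` options used throughout this document.

```js
// Hypothetical key files -- each service is authenticated with its own
// service account by passing a separate configuration to each package.
var datastore = require('@google-cloud/datastore')({
  projectId: 'grape-spaceship-123',
  keyFilename: '/path/to/datastore-service-account.json'
});

var storage = require('@google-cloud/storage')({
  projectId: 'grape-spaceship-123',
  keyFilename: '/path/to/storage-service-account.json'
});
```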
If you are running this client on Google Cloud Platform, we handle authentication for you with no configuration. You just need to make sure that when you set up the GCE instance, you add the correct scopes for the APIs you want to access.
```js
var storage = require('@google-cloud/storage')();
// ...you're good to go! See the next section to get started using the APIs.
```

If you are not running this client on Google Cloud Platform, you need a Google Developers service account. To create a service account:
- Visit the Google Developers Console.
- Create a new project or click on an existing project.
- Navigate to the APIs & auth > APIs section and enable the following APIs (you may need to enable billing in order to use these services):
- BigQuery API
- Cloud Bigtable API
- Cloud Bigtable Admin API
- Cloud Bigtable Table Admin API
- Cloud Spanner API
- Google Cloud Datastore API
- Google Cloud DNS API
- Google Cloud Firestore API
- Google Cloud Natural Language API
- Google Cloud Pub/Sub API
- Google Cloud Resource Manager API
- Google Cloud Speech API
- Google Cloud Storage
- Google Cloud Storage JSON API
- Google Cloud Translation API
- Google Cloud Vision API
- Google Compute Engine API
- Stackdriver Logging API
- Navigate to APIs & auth > Credentials and then:
- If you want to use a new service account key, click on Create credentials and select Service account key. After the account key is created, you will be prompted to download the JSON key file that the library uses to authenticate your requests (see the sketch after this list for passing it to a client).
- If you want to generate a new service account key for an existing service account, click on Generate new JSON key and download the JSON key file.
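Once you have downloaded the JSON key file, point the client at it when you instantiate it. A minimal sketch (the project ID and path are placeholders):

```js
// The keyFilename option references the JSON key downloaded in the step above.
var datastore = require('@google-cloud/datastore')({
  projectId: 'grape-spaceship-123',
  keyFilename: '/path/to/keyfile.json'
});
```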
Follow the activation instructions to use the Cloud Datastore API with your project.
```sh
$ npm install --save @google-cloud/datastore
```

```js
var datastore = require('@google-cloud/datastore');
```

See Authentication.
```js
var datastoreClient = datastore({
  projectId: 'grape-spaceship-123',
  keyFilename: '/path/to/keyfile.json'
});

var key = datastoreClient.key(['Product', 'Computer']);

datastoreClient.get(key, function(err, entity) {
  console.log(err || entity);
});

// Save data to Datastore.
var blogPostData = {
  title: 'How to make the perfect homemade pasta',
  author: 'Andrew Chilton',
  isDraft: true
};

var blogPostKey = datastoreClient.key('BlogPost');

datastoreClient.save({
  key: blogPostKey,
  data: blogPostData
}, function(err) {
  // `blogPostKey` has been updated with an ID so you can do more operations
  // with it, such as an update.
  blogPostData.isDraft = false;

  datastoreClient.save({
    key: blogPostKey,
    data: blogPostData
  }, function(err) {
    if (!err) {
      // The blog post is now published!
    }
  });
});
```

```sh
$ npm install --save @google-cloud/language
```

```js
var language = require('@google-cloud/language');
```

See Authentication.
```js
var languageClient = language({
  projectId: 'grape-spaceship-123',
  keyFilename: '/path/to/keyfile.json'
});

var content = 'Hello, world!';
var type = language.v1.types.Document.Type.PLAIN_TEXT;
var document = {
  content: content,
  type: type
};

languageClient.analyzeSentiment({document: document}).then(function(responses) {
  var response = responses[0];
  // doThingsWith(response)
}).catch(function(err) {
  console.error(err);
});
```

```sh
$ npm install --save @google-cloud/storage
```

```js
var storage = require('@google-cloud/storage');
```

See Authentication.
```js
var fs = require('fs');

var gcs = storage({
  projectId: 'grape-spaceship-123',
  keyFilename: '/path/to/keyfile.json'
});

// Create a new bucket.
gcs.createBucket('my-new-bucket', function(err, bucket) {
  if (!err) {
    // "my-new-bucket" was successfully created.
  }
});

// Reference an existing bucket.
var bucket = gcs.bucket('my-existing-bucket');

// Upload a local file to a new file to be created in your bucket.
bucket.upload('/photos/zoo/zebra.jpg', function(err, file) {
  if (!err) {
    // "zebra.jpg" is now in your bucket.
  }
});

// Download a file from your bucket.
bucket.file('giraffe.jpg').download({
  destination: '/photos/zoo/giraffe.jpg'
}, function(err) {});

// Streams are also supported for reading and writing files.
var remoteReadStream = bucket.file('giraffe.jpg').createReadStream();
var localWriteStream = fs.createWriteStream('/photos/zoo/giraffe.jpg');
remoteReadStream.pipe(localWriteStream);

var localReadStream = fs.createReadStream('/photos/zoo/zebra.jpg');
var remoteWriteStream = bucket.file('zebra.jpg').createWriteStream();
localReadStream.pipe(remoteWriteStream);
```

```sh
$ npm install --save @google-cloud/translate
```

```js
var translate = require('@google-cloud/translate');
```

See Authentication.
```js
var translateClient = translate({
  projectId: 'grape-spaceship-123',
  keyFilename: '/path/to/keyfile.json'
});

// Translate a string of text.
translateClient.translate('Hello', 'es', function(err, translation) {
  if (!err) {
    // translation = 'Hola'
  }
});

// Detect a language from a string of text.
translateClient.detect('Hello', function(err, results) {
  if (!err) {
    // results = {
    //   language: 'en',
    //   confidence: 1,
    //   input: 'Hello'
    // }
  }
});

// Get a list of supported languages.
translateClient.getLanguages(function(err, languages) {
  if (!err) {
    // languages = [
    //   'af',
    //   'ar',
    //   'az',
    //   ...
    // ]
  }
});
```

```sh
$ npm install --save @google-cloud/logging
```

```js
var logging = require('@google-cloud/logging');
```

See Authentication.
```js
var loggingClient = logging({
  projectId: 'grape-spaceship-123',
  keyFilename: '/path/to/keyfile.json'
});

// Create a sink using a Bucket as a destination.
var storage = require('@google-cloud/storage');
var gcs = storage();

loggingClient.createSink('my-new-sink', {
  destination: gcs.bucket('my-sink')
}, function(err, sink) {});

// Write a critical entry to a log.
var syslog = loggingClient.log('syslog');

var metadata = {
  resource: {
    type: 'gce_instance',
    labels: {
      zone: 'global',
      instance_id: '3'
    }
  }
};

var entry = syslog.entry(metadata, {
  delegate: process.env.user
});

syslog.critical(entry, function(err) {});

// Get all entries in your project.
loggingClient.getEntries(function(err, entries) {
  if (!err) {
    // `entries` contains all of the entries from the logs in your project.
  }
});
```

```sh
$ npm install --save @google-cloud/firestore
```

```js
const Firestore = require('@google-cloud/firestore');
```

See Authentication.
```js
const firestore = new Firestore({
  projectId: 'YOUR_PROJECT_ID',
  keyFilename: '/path/to/keyfile.json',
});

const document = firestore.doc('posts/intro-to-firestore');

// Enter new data into the document.
document.set({
  title: 'Welcome to Firestore',
  body: 'Hello World',
}).then(() => {
  // Document created successfully.
});

// Update an existing document.
document.update({
  body: 'My first Firestore app',
}).then(() => {
  // Document updated successfully.
});

// Read the document.
document.get().then(doc => {
  // Document read successfully.
});

// Delete the document.
document.delete().then(() => {
  // Document deleted successfully.
});
```

```sh
$ npm install --save @google-cloud/pubsub
```

```js
var pubsub = require('@google-cloud/pubsub');
```

See Authentication.
```js
var pubsubClient = pubsub({
  projectId: 'grape-spaceship-123',
  keyFilename: '/path/to/keyfile.json'
});

// Reference a topic that has been previously created.
var topic = pubsubClient.topic('my-topic');

// Publish a message to the topic.
var publisher = topic.publisher();
var message = new Buffer('New message!');
publisher.publish(message, function(err, messageId) {});

// Subscribe to the topic.
topic.createSubscription('subscription-name', function(err, subscription) {
  // Register listeners to start pulling for messages.
  function onError(err) {}
  function onMessage(message) {}
  subscription.on('error', onError);
  subscription.on('message', onMessage);

  // Remove listeners to stop pulling for messages.
  subscription.removeListener('message', onMessage);
  subscription.removeListener('error', onError);
});
```

```sh
$ npm install --save @google-cloud/spanner
```

```js
var spanner = require('@google-cloud/spanner');
```

See Authentication.
```js
var spannerClient = spanner({
  projectId: 'grape-spaceship-123',
  keyFilename: '/path/to/keyfile.json'
});

var instance = spannerClient.instance('my-instance');
var database = instance.database('my-database');

// Create a table.
var schema = `
  CREATE TABLE Singers (
    SingerId INT64 NOT NULL,
    FirstName STRING(1024),
    LastName STRING(1024),
    SingerInfo BYTES(MAX),
  ) PRIMARY KEY(SingerId)
`;

database.createTable(schema, function(err, table, operation) {
  if (err) {
    // Error handling omitted.
  }

  operation
    .on('error', function(err) {})
    .on('complete', function() {
      // Table created successfully.
    });
});

// Insert data into the table.
var table = database.table('Singers');

table.insert({
  SingerId: 10,
  FirstName: 'Eddie',
  LastName: 'Wilson'
}, function(err) {
  if (!err) {
    // Row inserted successfully.
  }
});

// Run a query as a readable object stream.
database.runStream('SELECT * FROM Singers')
  .on('error', function(err) {})
  .on('data', function(row) {
    // row.toJSON() = {
    //   SingerId: 10,
    //   FirstName: 'Eddie',
    //   LastName: 'Wilson'
    // }
  })
  .on('end', function() {
    // All results retrieved.
  });
```

```sh
$ npm install --save @google-cloud/speech
```

```js
var speech = require('@google-cloud/speech');
```

See Authentication.
```js
var speechClient = speech({
  projectId: 'my-project',
  keyFilename: '/path/to/keyfile.json'
});

var languageCode = 'en-US';
var sampleRateHertz = 44100;
var encoding = speech.v1.types.RecognitionConfig.AudioEncoding.FLAC;
var config = {
  languageCode: languageCode,
  sampleRateHertz: sampleRateHertz,
  encoding: encoding
};
var uri = 'gs://gapic-toolkit/hello.flac';
var audio = {
  uri: uri
};
var request = {
  config: config,
  audio: audio
};

speechClient.recognize(request).then(function(responses) {
  var response = responses[0];
  // doThingsWith(response)
}).catch(function(err) {
  console.error(err);
});
```

```sh
$ npm install --save @google-cloud/vision
```

```js
var vision = require('@google-cloud/vision');
```

See Authentication.
```js
var visionClient = vision({
  projectId: 'grape-spaceship-123',
  keyFilename: '/path/to/keyfile.json'
});

var gcsImageUri = 'gs://gapic-toolkit/President_Barack_Obama.jpg';
var source = {
  gcsImageUri: gcsImageUri
};
var image = {
  source: source
};
var type = vision.v1.types.Feature.Type.FACE_DETECTION;
var featuresElement = {
  type: type
};
var features = [featuresElement];
var requestsElement = {
  image: image,
  features: features
};
var requests = [requestsElement];

visionClient.batchAnnotateImages({requests: requests}).then(function(responses) {
  var response = responses[0];
  // doThingsWith(response)
}).catch(function(err) {
  console.error(err);
});
```

```sh
$ npm install --save @google-cloud/bigquery
```

```js
var bigquery = require('@google-cloud/bigquery');
```

See Authentication.
```js
var bigqueryClient = bigquery({
  projectId: 'grape-spaceship-123',
  keyFilename: '/path/to/keyfile.json'
});

// Access an existing dataset and table.
var schoolsDataset = bigqueryClient.dataset('schools');
var schoolsTable = schoolsDataset.table('schoolsData');

// Import data into a table.
schoolsTable.import('/local/file.json', function(err, job) {});

// Get results from a query job.
var job = bigqueryClient.job('job-id');

// Use a callback.
job.getQueryResults(function(err, rows) {});

// Or get the same results as a readable stream.
job.getQueryResults().on('data', function(row) {});
```

The Stackdriver Monitoring module uses an auto-generated client. It does not follow the conventions you're familiar with from other parts of our library, and a handwritten layer is not yet available.
The example below shows you how to instantiate the generated client. For further documentation, please browse the Monitoring .proto files on GitHub.
```sh
$ npm install --save @google-cloud/monitoring
```

```js
var monitoring = require('@google-cloud/monitoring');
```

See Authentication.
```js
var client = monitoring.metric({
  // optional auth parameters.
});

// Iterate over all elements.
var formattedName = client.projectPath(projectId);

client.listMonitoredResourceDescriptors({name: formattedName}).then(function(responses) {
  var resources = responses[0];
  for (var i = 0; i < resources.length; ++i) {
    // doThingsWith(resources[i])
  }
}).catch(function(err) {
  console.error(err);
});

// Or obtain the paged response.
var formattedName = client.projectPath(projectId);

var options = {autoPaginate: false};
function callback(responses) {
  // The actual resources in a response.
  var resources = responses[0];
  // The next request if the response shows there's more responses.
  var nextRequest = responses[1];
  // The actual response object, if necessary.
  // var rawResponse = responses[2];
  for (var i = 0; i < resources.length; ++i) {
    // doThingsWith(resources[i]);
  }
  if (nextRequest) {
    // Fetch the next page.
    return client.listMonitoredResourceDescriptors(nextRequest, options).then(callback);
  }
}

client.listMonitoredResourceDescriptors({name: formattedName}, options)
  .then(callback)
  .catch(function(err) {
    console.error(err);
  });
```

You may need to create an instance to use the Cloud Bigtable API with your project.
```sh
$ npm install --save @google-cloud/bigtable
```

```js
var bigtable = require('@google-cloud/bigtable');
```

See Authentication.
```js
var bigtableClient = bigtable({
  projectId: 'grape-spaceship-123',
  keyFilename: '/path/to/keyfile.json'
});

var instance = bigtableClient.instance('my-instance');
var table = instance.table('prezzy');

table.getRows(function(err, rows) {});

// Update a row in your table.
var row = table.row('alincoln');

row.save('follows:gwashington', 1, function(err) {
  if (err) {
    // Error handling omitted.
  }

  row.get('follows:gwashington', function(err, data) {
    if (err) {
      // Error handling omitted.
    }

    // data = {
    //   follows: {
    //     gwashington: [
    //       {
    //         value: 1
    //       }
    //     ]
    //   }
    // }
  });
});
```

```sh
$ npm install --save @google-cloud/dns
```

```js
var dns = require('@google-cloud/dns');
```

See Authentication.
```js
var dnsClient = dns({
  projectId: 'grape-spaceship-123',
  keyFilename: '/path/to/keyfile.json'
});

// Create a managed zone.
dnsClient.createZone('my-new-zone', {
  dnsName: 'my-domain.com.'
}, function(err, zone) {});

// Reference an existing zone.
var zone = dnsClient.zone('my-existing-zone');

// Create an NS record.
var nsRecord = zone.record('ns', {
  ttl: 86400,
  name: 'my-domain.com.',
  data: 'ns-cloud1.googledomains.com.'
});

zone.addRecords([nsRecord], function(err, change) {});

// Create a zonefile from the records in your zone.
zone.export('/zonefile.zone', function(err) {});
```

```sh
$ npm install --save @google-cloud/resource
```

```js
var resource = require('@google-cloud/resource');
```

See Authentication.
```js
var resourceClient = resource({
  projectId: 'grape-spaceship-123',
  keyFilename: '/path/to/keyfile.json'
});

// Get all of the projects you maintain.
resourceClient.getProjects(function(err, projects) {
  if (!err) {
    // `projects` contains all of your projects.
  }
});

// Get the metadata from your project. (defaults to `grape-spaceship-123`)
var project = resourceClient.project();

project.getMetadata(function(err, metadata) {
  // `metadata` describes your project.
});
```

```sh
$ npm install --save @google-cloud/compute
```

```js
var compute = require('@google-cloud/compute');
```

See Authentication.
```js
var gce = compute({
  projectId: 'grape-spaceship-123',
  keyFilename: '/path/to/keyfile.json'
});

// Create a new VM using the latest OS image of your choice.
var zone = gce.zone('us-central1-a');
var name = 'ubuntu-http';

zone.createVM(name, {os: 'ubuntu'}, function(err, vm, operation) {
  // `operation` lets you check the status of long-running tasks.
  operation
    .on('error', function(err) {})
    .on('running', function(metadata) {})
    .on('complete', function(metadata) {
      // Virtual machine created!
    });
});
```

The source code for the Node.js Cloud Debugger Agent lives in a separate repo.
```sh
$ npm install --save @google-cloud/debug-agent
```

```js
require('@google-cloud/debug-agent').start({allowExpressions: true});
```

For more details on API usage, please see the Stackdriver Debug Agent GitHub repository.
```sh
$ npm install --save @google-cloud/error-reporting
```

The module provides automatic uncaught exception handling, manual error reporting, and integration with common frameworks like express and hapi.
```js
var errors = require('@google-cloud/error-reporting')();
```

See Authentication.
```js
errors.report(new Error('Something broke!'));
```

For more details on API usage, please see the documentation.
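The Express integration mentioned above is wired up as error-handling middleware. A minimal sketch, assuming the module exposes an `express` middleware as described in its documentation (check the module's docs for the exact API):

```js
var express = require('express');
var errors = require('@google-cloud/error-reporting')();

var app = express();

app.get('/', function(req, res) {
  res.send('Hello, world!');
});

// Assumption: `errors.express` is the module's Express error-handling
// middleware; it should be registered after all routes.
app.use(errors.express);

app.listen(3000);
```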
The source code for the Node.js Cloud Trace Agent lives in a separate repo.
```sh
$ npm install --save @google-cloud/trace-agent
```

```js
var trace = require('@google-cloud/trace-agent').start();
```

For more details on API usage, please see the Stackdriver Trace Agent GitHub repository.
This library follows Semantic Versioning.
Please note it is currently under active development. Any release versioned 0.x.y is subject to backwards-incompatible changes at any time.
GA: Libraries defined at the GA (general availability) quality level are stable. Their API surface will not change in backwards-incompatible ways unless absolutely necessary (e.g., because of critical security issues) or with an extensive deprecation period. Issues and requests against GA libraries are addressed with the highest priority.
Please note that the auto-generated portions of the GA libraries (the ones in modules such as v1 or v2) are considered to be of Beta quality, even if the libraries that wrap them are GA.
Beta: Libraries defined at the Beta quality level are expected to be mostly stable, while we work towards their release candidate. We will address issues and requests with a higher priority.
Alpha: Libraries defined at the Alpha quality level are still a work-in-progress and are more likely to get backwards-incompatible updates.
Contributions to this library are always welcome and highly encouraged.
See CONTRIBUTING for more information on how to get started.
Apache 2.0 - See LICENSE for more information.