fs-path does not seem to be creating correct files/folders

All the console.log calls run, but the folder and files are not being created at all. What am I doing wrong?

You can assume everything used here is defined.

  public writeFiles = () => {
    console.log('writing files');
    const folderPath = `~/dev/generated-projects/${this.projectName}/server/api/routes/${this.structure.name}`;
    console.log('folder path', folderPath);
    fsPath.writeFileSync(path.join(folderPath, 'index.ts'), this.indexFile);
    fsPath.writeFileSync(path.join(folderPath, 'routes.ts'), this.routesFile);
    fsPath.writeFileSync(path.join(folderPath, 'helpers.ts'), this.helperFile);
    console.log('create routes for ', this.structure.name);
  }
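A hedged guess at the cause: Node does not expand ~ in paths, so the literal folder ~/dev/... never exists, and the parent folders generally need to exist before files can be written into them. A minimal sketch of that fix using core fs instead of fs-path, assuming it runs inside the same class with the same fields:

const os = require('os');
const path = require('path');
const fs = require('fs');

// Expand the home directory explicitly; Node treats "~" as a literal folder name
const folderPath = path.join(
  os.homedir(),
  'dev', 'generated-projects', this.projectName,
  'server', 'api', 'routes', this.structure.name
);

// Create the whole folder tree up front (Node >= 10.12)
fs.mkdirSync(folderPath, { recursive: true });
fs.writeFileSync(path.join(folderPath, 'index.ts'), this.indexFile);
fs.writeFileSync(path.join(folderPath, 'routes.ts'), this.routesFile);
fs.writeFileSync(path.join(folderPath, 'helpers.ts'), this.helperFile);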

App base path from a module in NodeJS

I’m building a web app in Node.js and implementing my API routes in separate modules. In one of my routes I’m doing some file manipulation, and I need to know the base app path. If I use __dirname, it gives me the directory that houses my module, of course.

I’m currently using this to get the base app path (given that I know the relative path to the module from base path):

path.join(__dirname, "../../", myfilename)

Is there a better way than using ../../? I’m running Node under Windows, so there is no process.env.PWD, and I don’t want to be platform-specific anyway.
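Two common alternatives, sketched as suggestions rather than the one right answer; both avoid hard-coding ../../ and are not Windows-specific (myfilename is the variable from the question):

const path = require('path');

// Option 1: resolve against the directory Node was started from
const fromCwd = path.join(process.cwd(), myfilename);

// Option 2: resolve against the entry script's directory, wherever this module lives
// (require.main can be undefined for preloaded modules, so guard it in real code)
const appRoot = path.dirname(require.main.filename);
const fromAppRoot = path.join(appRoot, myfilename);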

stream large files in node.js lambda

I am new to JavaScript and have written some Node.js code to calculate checksums of files in S3 by streaming them through the crypto module. It works fine when the items are small (1-5 GB); larger files time out because not all of the stream data has been consumed by the time the Lambda timeout is up, so the end event is never reached. I am wondering if there are ways to tune this code so that it handles big files in the 30 GB range? I noticed that memory in my Lambda is barely utilized: each run uses only about 10% (148 MB of the 1530 MB allocated). Can I do anything there? Any help is appreciated, thanks!

var AWS = require('aws-sdk');
const crypto = require('crypto');
const fs = require('fs');
const s3 = new AWS.S3();
let s3params = {
    Bucket: 'nlm-qa-int-draps-bucket',
    //Key: filename.toString(),
    Key: '7801339A.mkv',
};
let hash = crypto.createHash('md5');
let stream = s3.getObject(s3params).createReadStream();
stream.on('data', (data) => {
    hash.update(data);
});
stream.on('end', () => {
    var digest = hash.digest('hex');
    console.log("this is md5 value from digest: " + digest);
    callback(null, digest);
    digest = digest.toString().replace(/[^A-Za-z 0-9 .,?""!@#$%^&*()-_=+;:<>/\|}{[]`~]*/g, '');
});

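One direction to try, offered as a sketch under assumptions rather than a drop-in fix: make the handler async and consume the stream with for await...of, so the handler stays alive until the hash finishes and stream errors are surfaced instead of silently dropped. Separately, Lambda memory size also scales CPU and network throughput, so raising it can speed up streaming even though memory itself is barely used; the 15-minute Lambda cap remains a hard limit no matter how the code is tuned.

const crypto = require('crypto');

// Assumes the s3 client and s3params from the question are in scope
exports.handler = async () => {
  const hash = crypto.createHash('md5');
  const stream = s3.getObject(s3params).createReadStream();
  for await (const chunk of stream) {
    hash.update(chunk); // a stream error rejects the handler rather than hanging it
  }
  return hash.digest('hex');
};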
Integrating Firebase-Admin with Aurelia

In trying to implement firebase-admin in my Aurelia project, I’ve run into some issues; I was wondering if someone could help me out.

I first installed firebase-admin through the console with:

npm install --save firebase-admin

Then on main.js, I imported it as admin and declared the serviceAccount.

import * as admin from 'firebase-admin';
serviceAccount: './serviceAccountKey.json';

On the same page, inside export function configure(aurelia), I put this:

  admin.initializeApp({
    credential: './serviceAccountKey.json',
    databaseURL: 'https://contactmanager-be4d3.firebaseio.com'
  });

I also made sure to reference firebase-admin on the aurelia.json file:

  {
    "name": "firebase-admin",
    "path": "../node_modules/firebase-admin/lib",
    "main": "index"
  },

But when running, I get this error:

{ Error: ENOENT: no such file or directory, open 'C:\Users\vasco.bento\chapter-3\app\src\fs.js'
    at Object.fs.openSync (fs.js:646:18)
    at Object.fs.readFileSync (fs.js:551:33)
    at Object.exports.readFileSync (C:\Users\vasco.bento\chapter-3\app\node_modules\aurelia-cli\lib\file-system.js:49:13)
    at fileRead (C:\Users\vasco.bento\chapter-3\app\node_modules\aurelia-cli\lib\build\bundled-source.js:83:31)
    at Object.context.fileRead (C:\Users\vasco.bento\chapter-3\app\node_modules\aurelia-cli\lib\build\amodro-trace\lib\loader\Loader.js:176:18)
    at Object.context.load (C:\Users\vasco.bento\chapter-3\app\node_modules\aurelia-cli\lib\build\amodro-trace\lib\loader\Loader.js:357:30)
    at Module.load (eval at <anonymous> (C:\Users\vasco.bento\chapter-3\app\node_modules\aurelia-cli\lib\build\amodro-trace\lib\loader\Loader.js:14:1), <anonymous>:832:29)
    at Module.fetch (eval at <anonymous> (C:\Users\vasco.bento\chapter-3\app\node_modules\aurelia-cli\lib\build\amodro-trace\lib\loader\Loader.js:14:1), <anonymous>:822:66)
    at Module.check (eval at <anonymous> (C:\Users\vasco.bento\chapter-3\app\node_modules\aurelia-cli\lib\build\amodro-trace\lib\loader\Loader.js:14:1), <anonymous>:854:30)
    at Module.enable (eval at <anonymous> (C:\Users\vasco.bento\chapter-3\app\node_modules\aurelia-cli\lib\build\amodro-trace\lib\loader\Loader.js:14:1), <anonymous>:1173:22)
  errno: -4058,
  code: 'ENOENT',
  syscall: 'open',
  path: 'C:\Users\vasco.bento\chapter-3\app\src\fs.js',
  moduleTree: [ 'firebase-admin/firebase-namespace' ],
  fileName: 'C:/Users/vasco.bento/chapter-3/app/node_modules/firebase-admin/lib/firebase-namespace.js' }

Which makes sense, given fs isn’t inside the lib folder, or any folder inside node_modules/firebase-admin.

[screenshot: contents of the node_modules/firebase-admin folder]

I was wondering how I could be able to solve this issue, and the best way to set up firebase-admin with Aurelia.
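A likely explanation, offered as a hypothesis: firebase-admin is a server-side SDK that depends on Node built-ins such as fs, so Aurelia's browser bundler cannot trace it, and that is exactly the fs.js the loader fails to open. Browser code would normally use the client firebase SDK instead. If the code genuinely runs under Node, initialization would look roughly like this, using the documented credential.cert() helper rather than a bare path string:

const admin = require('firebase-admin');
const serviceAccount = require('./serviceAccountKey.json');

admin.initializeApp({
  credential: admin.credential.cert(serviceAccount),
  databaseURL: 'https://contactmanager-be4d3.firebaseio.com'
});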

How to check if fs.read() is empty

I need to check whether a file is empty before I can put it in JSON.parse().

if (fs.exists('path/to/file')) { // true
   return JSON.parse(fs.read('path/to/file'));
}

I know that the file exists via fs.exists(), but how can I check whether the file contains nothing before I put it in JSON.parse()?

JSON.parse(fs.read('path/to/file'));

Returns:

SyntaxError: JSON Parse error: Unexpected EOF
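The fs.exists/fs.read calls above look like a PhantomJS-style fs module rather than Node's, so the Node version below is only an illustration of the idea: read first, check for emptiness, then parse.

const fs = require('fs');

function readJson(file) {
  if (!fs.existsSync(file)) return null;
  const content = fs.readFileSync(file, 'utf8');
  // An empty (or whitespace-only) file is what makes JSON.parse throw "Unexpected EOF"
  if (content.trim() === '') return null;
  return JSON.parse(content);
}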

Use fs-extra on .bundle file?

I’m trying to check if a specific folder within my project contains a .bundle file and, if it does, move it somewhere; otherwise use a default. The issue I’m having is that I can’t seem to check whether the file exists using fs-extra.

This is what I have right now; whether the file exists or not, it still logs true.

fs.exists(themeDir + "Lights.bundle").then(() => {
  console.log("true");
});

From what I’ve read, it should reject the promise if the file does not exist.
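For comparison, a sketch using fs-extra's pathExists, which resolves to a boolean rather than rejecting, plus path.join so a missing separator in themeDir cannot silently produce the wrong path:

const fs = require('fs-extra');
const path = require('path');

fs.pathExists(path.join(themeDir, 'Lights.bundle'))
  .then(exists => {
    if (exists) {
      // move the bundle, e.g. with fs.move(src, dest)
    } else {
      // fall back to the default
    }
  });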

How do I write the contents of a variable into fs.writeFile

I’m fairly new to Node.js and coding in general, but I’m trying to write the value of the variable named parsed into fs.writeFile().

Is there any way to do it? Whenever I run this, it creates the .txt file, but the file just contains [object Object].

My apologies in advance if this doesn’t make any sense:

https.get(`https://api.openweathermap.org/data/2.5/weather?zip=${zip},us&APPID=${config.key}`, res => {

    res.on('data', data => {
        body = data.toString();
    });

    res.on('end', () => {
        const parseData = JSON.parse(body);
        fs.writeFile('data.txt', parsedData)
        console.log(parsed_data);
    });

});

module.exports = wrappedAll;
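A sketch of one possible fix, keeping the question's assumptions (zip and config.key defined): append each chunk instead of overwriting body, use one consistent variable name, and serialize with JSON.stringify, since writing an object directly stringifies it to "[object Object]".

const https = require('https');
const fs = require('fs');

https.get(`https://api.openweathermap.org/data/2.5/weather?zip=${zip},us&APPID=${config.key}`, res => {
  let body = '';
  res.on('data', data => {
    body += data; // append; reassigning keeps only the last chunk
  });
  res.on('end', () => {
    const parsed = JSON.parse(body);
    // JSON.stringify turns the object back into text before it hits the file
    fs.writeFile('data.txt', JSON.stringify(parsed, null, 2), err => {
      if (err) console.error(err);
    });
  });
});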

Getting Local File Path from imagePicker (iOS, Objective C)

I am struggling to get the file path from imagePicker and send it to the server.

This is the attempt to pass over the local path client-side:

NSData *webData = UIImagePNGRepresentation(info[UIImagePickerControllerOriginalImage]);
NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
NSString *documentsDirectory = [paths objectAtIndex:0];
NSString *localFilePath = [documentsDirectory stringByAppendingPathComponent:@"test.png"];
[webData writeToFile:localFilePath atomically:YES];

AFHTTPSessionManager *manager = [AFHTTPSessionManager manager];
[manager POST:url.absoluteString
   parameters:@{@"path": localFilePath}
     progress:nil
      success:^(NSURLSessionDataTask * _Nonnull task, id  _Nullable responseObject) {
... // always results in an error

This is the backend code snippet receiving the path key (in JavaScript):

var fs = require('file-system');

const file = {
  data: fs.readFileSync(path),
};

However, I always get errors like this:

Error: ENOENT: no such file or directory, open ‘/Library/Developer/CoreSimulator/Devices…

Is there a problem in my JS code? Or what other approach should I use to get the file path?
I’d be happy if someone could help me out here.

P.S. other popular existing StackOverflow answers on getting the file path induce the same error code.
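One working pattern, sketched with hypothetical endpoint and field names: a simulator/device path only exists on the phone, so the server can never fs.readFileSync it. Upload the file bytes themselves as multipart/form-data instead; on iOS, AFNetworking's POST:parameters:constructingBodyWithBlock: variant can append the UIImagePNGRepresentation data as a file part. A Node receiver using Express and multer might look like this:

const express = require('express');
const multer = require('multer');

const app = express();
const upload = multer({ dest: 'uploads/' }); // uploaded files land on the server's own disk

// 'file' must match the multipart field name the client sends
app.post('/upload', upload.single('file'), (req, res) => {
  // req.file.path is now a real path on the server, safe to read with fs
  res.json({ savedAs: req.file.path });
});

app.listen(3000);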