Getting an image from a URL and uploading it to the DB via Meteor Files

I’m trying to build a system that fetches WordPress posts via its API.
I managed to work with the WordPress API to the point where I get the returned objects.
The string-based data I can handle, but I have no idea how to treat the featured image.
The idea is to fetch the featured image (the API returns an absolute URL), download the image, and re-upload it to a new ‘Image’ collection.

I managed to get this far:

import { Meteor } from 'meteor/meteor';
import { Jobs } from 'meteor/wildhart:jobs';
import { fetch, Headers } from 'meteor/fetch';

// import UI elements
import message from 'antd/es/message';

// import collections
import { News } from '../imports/api/news';

const username = Meteor.settings.private.WordpressAddUsername;
const pass = Meteor.settings.private.WordpressAppPass;

// tried to convert into base64, but the console complained window is not defined
const blobtobase64 = blob => {
    const reader = new window.FileReader();
    return new Promise(resolve => {
        reader.onloadend = () => resolve(reader.result);
        reader.readAsDataURL(blob);
    });
};

async function fetchNews() {
    try {
        const response = await fetch('', {
            headers: new Headers({
                'Content-Type': 'application/json',
                'Authorization': 'Basic ' + Buffer.from(`${username}:${pass}`).toString('base64')
            })
        });
        const newsData = await response.json();
        // inserting new news into the db
        for (var i = 0; i < newsData.length; i++) {

            // process the featured image first

            let imageURL = newsData[i].jetpack_featured_media_url;
            fetch(imageURL)
                .then(response => response.blob())
                .then(imageBlob => {
                    blobtobase64(imageBlob).then(res => {
                        console.log('imageURL64: ', res);
                    });
                });
        }
    } catch (err) {
        this.reschedule({ in: { minutes: 2 } });
    }
}

Jobs.register({ fetchNewsData: fetchNews });
Jobs.run('fetchNewsData', { singular: true });

but I bumped into ReferenceError: window is not defined.

How do I actually pull this off?

  1. Fetching the image from the WordPress API
  2. Uploading the image (not the link to the image) to a new Image collection using Meteor Files.

A couple of things.

First, I’d make sure I can save those images without infringing those websites’ policies. Sites that make money from traffic, especially, often require you to link to them and prohibit you from copying their content.

Saving images into MongoDB itself is not really feasible. Most of us use CDNs for that, which you can also use with Meteor-Files. However, you need to understand what Meteor-Files is for: the package lets you save files in various storages (MongoDB, AWS S3, Dropbox, etc.) and it obfuscates the real URL of the file. If you need high security, this is the package you need. If you just want to behave like WordPress, you probably only need to upload to a CDN and deliver the content to your users from an edge storage (basically the CDN).
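For the server-side "download the featured image and store it" case specifically, Meteor-Files can pull a remote URL straight into a collection. A minimal sketch, assuming the ostrio:files package; the collection name and storage path are placeholders, not from the original post:

```javascript
import { Meteor } from 'meteor/meteor'
import { FilesCollection } from 'meteor/ostrio:files'

// Hypothetical Images collection; adjust names and paths to your app
const Images = new FilesCollection({
  collectionName: 'Images',
  allowClientCode: false, // only the server may remove files
  storagePath: 'assets/app/uploads/images'
})

if (Meteor.isServer) {
  // FilesCollection#load (server only) downloads a file over HTTP
  // and stores it in the collection — no FileReader needed.
  const saveFeaturedImage = url =>
    new Promise((resolve, reject) => {
      Images.load(url, { fileName: url.split('/').pop() }, (error, fileRef) => {
        error ? reject(error) : resolve(fileRef)
      })
    })
  // usage sketch: await saveFeaturedImage(post.jetpack_featured_media_url)
}

export { Images }
```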

For this case we use a principle called Slingshot, where you upload from the client straight to the storage without passing through the Meteor server. For some storages, like S3, the Meteor server is only used to authorize the upload links, so that the client gets permission to write to the storage.
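The authorization step could be sketched like this, assuming the AWS SDK v3 presigner; the method name, bucket, and expiry are placeholders:

```javascript
import { Meteor } from 'meteor/meteor'
import { S3Client, PutObjectCommand } from '@aws-sdk/client-s3'
import { getSignedUrl } from '@aws-sdk/s3-request-presigner'

const s3 = new S3Client({ region: process.env.AWS_S3_REGION })

Meteor.methods({
  // Server signs a PUT URL; the client then uploads the file body
  // directly to the bucket with plain fetch, bypassing the Meteor server.
  async 'images.signUpload' (key, contentType) {
    const command = new PutObjectCommand({
      Bucket: process.env.AWS_S3_BUCKET,
      Key: key,
      ContentType: contentType
    })
    return getSignedUrl(s3, command, { expiresIn: 60 }) // valid for 60 s
  }
})
```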

There is a conversation about it here: S3 file upload - is slingshot still usable ? alternatives?

Another issue is very large photos. You may want to control the quantity and quality of what you save to your storage; WordPress in particular can serve uselessly large photos.

For this I use a simple client-side image processor:

// define your image limits

const steppedScale = async (img, step) => {
  return await new Promise(resolve => {
    const { maxImageSize } = imageLimits // define this to avoid very large images
    const canvas = document.createElement('canvas')
    const ctx = canvas.getContext('2d')
    const oc = document.createElement('canvas')
    const octx = oc.getContext('2d')

    const ratio = img.width / img.height
    const canvasWidth = Math.min(ratio >= 1 ? maxImageSize : maxImageSize * ratio, img.width)

    // -- stepped scaling --
    canvas.width = canvasWidth // destination canvas size
    canvas.height = canvasWidth * img.height / img.width
    if (img.width * step > maxImageSize) { // For performance avoid unnecessary drawing
      const mul = 1 / step
      let cur = {
        width: Math.floor(img.width * step),
        height: Math.floor(img.height * step)
      }

      oc.width = cur.width
      oc.height = cur.height

      octx.drawImage(img, 0, 0, cur.width, cur.height)

      while (cur.width * step > maxImageSize) {
        cur = {
          width: Math.floor(cur.width * step),
          height: Math.floor(cur.height * step)
        }
        octx.drawImage(oc, 0, 0, cur.width * mul, cur.height * mul, 0, 0, cur.width, cur.height)
      }
      ctx.drawImage(oc, 0, 0, cur.width, cur.height, 0, 0, canvas.width, canvas.height)
    } else {
      ctx.drawImage(img, 0, 0, canvas.width, canvas.height)
    }
    // decimal below gives you image quality (generally set between 0.6 and 0.75). If your image is webp, you don't really need to process it.
    resolve({ image: canvas.toDataURL('image/jpeg', 0.75), size: { width: canvas.width, height: canvas.height } })
  })
}

const processImages = async images => {
  return Promise.all( => {
    const processedImages = []
    const img = new Image()
    // img.crossOrigin = "Anonymous";
    img.onload = async () => {
      await new Promise(resolve => resolve(steppedScale(img, 0.5))).then(value => processedImages.push(value))
    }
    img.src = URL.createObjectURL(image)
    return processedImages
  })).then(values => values)
}

export { processImages }

Your uploader might look like this (with processed images). When you save images to the DB, it is good to save their sizes as well; that is really helpful when you need to lazy load them:

const requestUpload = async () => {
  const images = await processImages(uploadedFiles)
  const reMap = () => {
    const destructuredImages = []
    images?.forEach(image => {
      if (image?.[0]?.image) {
        // slice(23) strips the 'data:image/jpeg;base64,' prefix
        destructuredImages.push({ image: b64ToBlob(image[0]?.image?.slice(23), 'image/jpeg'), size: image?.[0]?.size })
      }
    })
    if (destructuredImages.length === uploadedFiles.length) {
      uploadImage(destructuredImages, `your_cdn_path`)
    } else {
      setTimeout(reMap, 200) // images may still be processing; poll again
    }
  }
  reMap()
}

The uploader: upload your image and receive a filename or path to the uploaded image. Save that together with the image size, e.g. img = { url: “…”, size: { width: 1024, height: 768 } }

const uploadImage = // … whatever you prefer: via Meteor-Files, a package like GitHub - activitree/s3up-meta, or another Slingshot type of uploader.
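As a small illustration of why the stored size pays off at render time, here is a hypothetical helper (not from the original) that reserves layout space from the saved dimensions so the page does not jump while the image lazy loads:

```javascript
// Build a placeholder style from the stored { url, size } document
const placeholderStyle = img => ({
  aspectRatio: `${img.size.width} / ${img.size.height}`,
  // percentage padding fallback for browsers without aspect-ratio support
  paddingBottom: `${(img.size.height / img.size.width * 100).toFixed(2)}%`
})

// e.g. placeholderStyle({ url: '…', size: { width: 1024, height: 768 } })
// → { aspectRatio: '1024 / 768', paddingBottom: '75.00%' }
```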

The b64ToBlob function:

const b64ToBlob = (b64Data, contentType, sliceSize) => {
  let byteNumbers, i, slice
  let offset = 0
  const byteCharacters = atob(b64Data)
  const byteArrays = []
  sliceSize = sliceSize || 512
  while (offset < byteCharacters.length) {
    slice = byteCharacters.slice(offset, offset + sliceSize)
    byteNumbers = []
    for (i = 0; i < slice.length; ++i) {
      byteNumbers[i] = slice.charCodeAt(i)
    }
    byteArrays.push(new Uint8Array(byteNumbers))
    offset += sliceSize
  }
  return new Blob(byteArrays, { type: contentType })
}

export default b64ToBlob

Finally, and what you actually asked for: this is an image fetcher based on the new “meteor/fetch” (like your example), if you want a server-side uploader. The Meteor server fetches from the source and uploads to S3. You can embed this in a method that takes a URL as an argument and returns a link to the file. In my case this runs on the server: although the S3 name suggests a client, it is actually a server-side component. You can fetch and upload from a client too if you use a client-side uploader, but using fetch on the server side spares you the whole CORS nightmare.

import { S3Client, PutObjectCommand } from '@aws-sdk/client-s3'
import { fetch } from 'meteor/fetch'

const s3 = new S3Client({
  region: process.env.AWS_S3_REGION || 'eu-central-1',
  sslEnabled: true, // optional
  httpOptions: {
    timeout: 6000,
    agent: false
  }
})

const putFromUrl = async (url, key) => {
  try {
    const response = await fetch(url, { method: 'get' })
    const command = new PutObjectCommand({
      Bucket: process.env.AWS_S3_BUCKET,
      Key: key,
      ContentType: response.headers.get('Content-Type'),
      ContentLength: response.headers.get('Content-Length'),
      // Expires: 'Thu, 15 Dec 2050 04:08:00 GMT',
      CacheControl: 'max-age=8460000',
      Body: await response.buffer() // buffer
    })
    s3.send(command, err => {
      if (err) { console.error('Could not upload photo to S3, ', err) }
    })
  } catch (error) {
    if (error) { // instanceof fetch.AbortError
      console.log('#### request was aborted', error)
    }
  }
}

To fetch beautiful previews and content, I use this: metascraper - npm

The code runs on the server, not the browser, and window only exists in the browser. You need corresponding code that works in Node.
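A sketch of the Node-side equivalent: since meteor/fetch wraps node-fetch, the response can be read as a Buffer, which replaces the FileReader round-trip entirely (bufferToBase64DataUrl is a made-up helper name):

```javascript
// On the server there is no window.FileReader; Node's Buffer covers
// the same need. Hypothetical replacement for the blobtobase64 helper:
const bufferToBase64DataUrl = (buffer, contentType) =>
  `data:${contentType};base64,${buffer.toString('base64')}`

// With meteor/fetch (node-fetch under the hood) it would be used like:
// const response = await fetch(imageURL)
// const dataUrl = bufferToBase64DataUrl(
//   await response.buffer(),
//   response.headers.get('Content-Type')
// )
```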