Author Archives: Raghavendar T S

Apollo GraphQL – Private (Authentication)/Public API using Schema Directives/Annotation

This is one of the most common use cases: we need to disable authentication for specific APIs, such as a Login API (Generate Access Token). All our APIs are hosted in a single instance of Apollo GraphQL server (we did not use any middleware such as Express). There are a number of ways to solve this problem. The main idea behind the solution is that we should not throw an error from the context function.

Context: Do not throw any error from the context

Note: Make sure you build the context only if the Authorization header is present in the HTTP request. Do not assume that the header will always be available, to avoid null pointer errors.

const apolloServer: ApolloServer = new ApolloServer({
    context: async ({ req }) => {
        let context = null;
        try {
            //Build the context here
            //If unauthorized, set an error on the context
            //instead of throwing:
            // context = {
            //    error: Unauthorized
            //};
        }
        catch (e) {
            context = {
                error: e //Or anything you like
            };
        }
        return context;
    }
});
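As a rough sketch of what the "Build Context" step above might look like (the `verifyToken` helper, the token value, and the request shape here are illustrative assumptions, not part of Apollo's API):

```typescript
// Minimal shape of the incoming request; Apollo passes the HTTP request here.
interface Req {
    headers: { authorization?: string };
}

// Stand-in for real token verification (e.g. a JWT library).
function verifyToken(token: string): { userId: string } | null {
    return token === "valid-token" ? { userId: "user-1" } : null;
}

// Build the context without ever throwing: failures are carried
// on the context so public resolvers (e.g. login) can still run.
function buildContext(req: Req): { user?: { userId: string }; error?: Error } {
    const header = req.headers.authorization; // may be absent
    if (!header) {
        return { error: new Error("Unauthorized") };
    }
    const user = verifyToken(header.replace("Bearer ", ""));
    return user ? { user } : { error: new Error("Unauthorized") };
}
```

Note that an unauthenticated request still gets a context (carrying an error); only resolvers that check context.error will reject it.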

Approach 1: Throw Error from Resolver (Typical Approach)

Resolver Example

export default {
    Query: {
        async testAPI(parent: any, args: any, context: any, info: any): Promise<any> {
            if(context.error) {
                throw context.error;
            }
            //Business Logic
        }
    }
}

You will have to add the above check to each of the resolvers to throw the required error back to the client.

Approach 2: Schema Directives

Authentication Directive

import { GraphQLField, defaultFieldResolver } from "graphql";
import { SchemaDirectiveVisitor } from "apollo-server";
export class AuthenticationDirective extends SchemaDirectiveVisitor {
   visitFieldDefinition(field: GraphQLField<any, any>) {
      //Fall back to the default resolver when the field has none of its own
      const { resolve = defaultFieldResolver } = field;
      field.resolve = async function (source, args, context, info) {
         if (context.error) {
            throw context.error;
         }
         return resolve.apply(this, [source, args, context, info]);
      };
   }
}
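The wrapping performed by visitFieldDefinition can be illustrated outside Apollo as a plain higher-order function (the resolver and the names below are illustrative, not part of the directive API):

```typescript
type Resolver = (source: any, args: any, context: any, info: any) => Promise<any>;

// Mirrors what AuthenticationDirective does to each @authenticate field:
// reject when the context carries an error, otherwise delegate.
function withAuthentication(resolve: Resolver): Resolver {
    return async (source, args, context, info) => {
        if (context.error) {
            throw context.error;
        }
        return resolve(source, args, context, info);
    };
}

// Illustrative resolver, wrapped the same way the directive wraps field.resolve
const persons = withAuthentication(async () => ["alice", "bob"]);
```

Calling persons with a clean context resolves normally; a context carrying an error rejects before the business logic runs.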

Schema

export default gql`

   directive @authenticate on FIELD_DEFINITION

    extend type Query {
        #AuthenticationDirective will be executed
        persons: PersonInfo! @authenticate

        #AuthenticationDirective will not be executed since the annotation
        #@authenticate is not added
        login: LoginInfo!
    }
`

Apollo Server

Add schemaDirectives while initializing the Apollo Server instance.

const apolloServer: ApolloServer = new ApolloServer({
    schemaDirectives: {
        authenticate: AuthenticationDirective
    },
    context: async ({ req }) => {
        let context = null;
        try {
            //Build the context here
            //If unauthorized, set an error on the context
            //instead of throwing:
            // context = {
            //    error: Unauthorized
            //};
        }
        catch (e) {
            context = {
                error: e //Or anything you like
            };
        }
        return context;
    }
});

Note that we can also move the authentication logic into the directive instead of having it in the context function. There might be better solutions as well; kindly comment below if you know of any.
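A sketch of that variant, with the check moved into the wrapper so the context function stays trivial (the header scheme and the token check here are assumptions; substitute real verification):

```typescript
type Ctx = { req: { headers: { authorization?: string } } };
type Resolver = (source: any, args: any, context: Ctx, info: any) => Promise<any>;

// The directive's wrapper verifies the request itself instead of relying
// on an error placed on the context beforehand.
function authenticateHere(resolve: Resolver): Resolver {
    return async (source, args, context, info) => {
        const header = context.req.headers.authorization;
        //Replace this check with real token verification (e.g. a JWT library)
        if (header !== "Bearer valid-token") {
            throw new Error("Unauthorized");
        }
        return resolve(source, args, context, info);
    };
}
```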

Google Sheets – QUERY with WHERE condition from another Sheet/Tab

Assume there are two tabs/sheets (Sheet 1 and Sheet 2) in the workbook, and the requirement is to query all or a specific set of columns from Sheet 2. Select a cell in Sheet 1 and add the following formula.

Formula

=query('Sheet 2 - Test Sheet'!A1:E10, "SELECT B, C, D, E WHERE A Matches '" & D34 & "' ", 0)

Details

1. Sheet 2 - Test Sheet is the name of the second tab/sheet.
2. A1:E10 is the range of the dataset on which we want to run the query.
3. We are selecting columns B, C, D and E from Sheet 2.
4. The WHERE clause matches column A in Sheet 2 against the cell value in D34 of Sheet 1.
5. The final argument 0 tells QUERY that the range has no header rows, so the header from Sheet 2 is not copied into Sheet 1.

Solved – mount: wrong fs type, bad option, bad superblock on Linux (AWS EBS EC2)

The problem is that the block device does not have a file system yet. We need to create one, after which we can mount the block device on the required directory.

Error

mount: wrong fs type, bad option, bad superblock on /dev/xvdf,
       missing codepage or helper program, or other error
       In some cases useful info is found in syslog - try
       dmesg | tail or so.

Solution

1. Execute the following command to get the list of all block devices
lsblk --output NAME,TYPE,SIZE,FSTYPE,MOUNTPOINT,UUID,LABEL

2. Create a directory to mount the block device 
mkdir -p /test/directory

3. Create an ext4 file system (warning: this formats the device and erases any existing data)
mkfs -t ext4 /dev/xvdf

4. Mount the block device
mount /dev/xvdf /test/directory

5. Unmount the block device (for testing)
umount /dev/xvdf

Docker RabbitMQ – Default Username/Password Environment Variable Not Working – Solved

I was playing around with the official Docker image of RabbitMQ, trying to use the RABBITMQ_DEFAULT_USER/RABBITMQ_DEFAULT_PASS environment variables to override the default username/password (guest/guest). I noticed that the username/password provided via the environment variables were set in the rabbitmq.conf file. Still, I was able to access neither the queues via the API nor the management console, although I was able to log in to the management console using the guest username.

Solution: Deleting the data directory and recreating the container solved the issue

Docker DataStax DSE Volume Mount Not Working – Solved

I was playing around with DataStax DSE Docker image using Docker Compose and noticed that the Docker volumes were not mounted. I was able to solve the issue by using named volume mounts (Docker Compose v3).

Not Working Docker Compose YAML

version: "2"
services:
  dse01:
    volumes:
      - /mnt/dse/data/cassandra:/var/lib/cassandra
      - /mnt/dse/data/spark:/var/lib/spark

Working Docker Compose YAML

version: "3.2"
services:
  dse01:
    volumes:
      - type: bind
        source: /mnt/dse/data/cassandra
        target: /var/lib/cassandra
      - type: bind
        source: /mnt/dse/data/spark
        target: /var/lib/spark

If you are using the docker run command, refer to the following syntax. Note that --mount defaults to a named volume, so type=bind is required for a host-path bind mount.

docker run ....
 --mount type=bind,source=/mnt/dse/data/cassandra,target=/var/lib/cassandra
.... datastax/dse-server