How to run a Django and Spark application

Issue

I am working on a Spark application and want to expose it through a REST API in Django. Below is my code:

from django.shortcuts import render
from django.http import Http404
from rest_framework.views import APIView
from rest_framework.decorators import api_view
from rest_framework.response import Response
from rest_framework import status
from django.http import JsonResponse
from django.core import serializers
from django.conf import settings
import json
from pyspark import SparkContext, SparkConf, SQLContext


sc = SparkContext()
sql = SQLContext(sc)

# Note: the SQLContext variable is `sql` (lowercase), not `Sql`
df = sql.read.format("jdbc").options(
        url = "jdbc:mysql://127.0.0.1:3306/demo",
        driver = "com.mysql.cj.jdbc.Driver",
        dbtable = "tablename",
        user = "xyz",
        password = "abc" 
).load()

totalrecords = df.count()


# Create your views here.
@api_view(["GET"])
def Demo(request):  # a function-based view receives the request, not `self`
    try:
        a = str(totalrecords)
        return JsonResponse(a,safe=False)
    except ValueError as e:
        return Response(e.args[0],status.HTTP_400_BAD_REQUEST)

I want to know how to run this code. I tried “python manage.py runserver” directly, which does not work. So how do I run this Django API together with Spark via spark-submit, with all the required Spark JAR files?

Solution

To run this code you have to start the Django development server through spark-submit rather than plain python, so the Spark runtime and the MySQL connector JAR are on the classpath:

spark-submit --jars mysql.jar manage.py runserver 0.0.0.0:8000

or, if the JDBC driver is already available on the classpath:

spark-submit manage.py runserver
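As a side note, creating the SparkContext at module import time means every server worker tries to start its own Spark driver, and the record count is read only once at startup. Below is a minimal sketch of a lazier setup; it assumes pyspark is installed via pip, and the JAR path, app name, and helper names (get_spark, count_records, jdbc_options) are illustrative, not part of the original code:

```python
from functools import lru_cache


def jdbc_options(table):
    # Connection details mirror the question; adjust for your database.
    return {
        "url": "jdbc:mysql://127.0.0.1:3306/demo",
        "driver": "com.mysql.cj.jdbc.Driver",
        "dbtable": table,
        "user": "xyz",
        "password": "abc",
    }


@lru_cache(maxsize=1)
def get_spark():
    # Imported lazily so plain manage.py commands (migrate, shell, ...)
    # still work on machines where Spark is not set up.
    from pyspark.sql import SparkSession

    return (
        SparkSession.builder
        .appName("django-demo")
        # Hypothetical path: point this at your MySQL connector JAR.
        .config("spark.jars", "/path/to/mysql-connector-j.jar")
        .getOrCreate()
    )


def count_records(table):
    # The table is re-read on each call instead of once at import time,
    # so the API returns a current count rather than a stale one.
    df = get_spark().read.format("jdbc").options(**jdbc_options(table)).load()
    return df.count()
```

The view would then call count_records("tablename") inside the request handler, and the first request pays the one-time Spark startup cost.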

Answered By – sudomudo

This answer was collected from Stack Overflow and is licensed under CC BY-SA 2.5, CC BY-SA 3.0 and CC BY-SA 4.0.
