How to run a Django and Spark application


I am working on a Spark application and I want to create a REST API in Django. Below is my code:

from django.shortcuts import render
from django.http import Http404
from rest_framework.views import APIView
from rest_framework.decorators import api_view
from rest_framework.response import Response
from rest_framework import status
from django.http import JsonResponse
from django.core import serializers
from django.conf import settings
import json
from pyspark import SparkContext, SparkConf
from pyspark.sql import SQLContext

sc = SparkContext()
sql = SQLContext(sc)

df = sql.read.format("jdbc").options(
        url = "jdbc:mysql://",
        driver = "com.mysql.cj.jdbc.Driver",
        dbtable = "tablename",
        user = "xyz",
        password = "abc"
).load()

totalrecords = df.count()

# Create your views here.
@api_view(['GET'])
def Demo(request):
    try:
        a = str(totalrecords)
        return JsonResponse(a, safe=False)
    except ValueError as e:
        return Response(e.args[0], status.HTTP_400_BAD_REQUEST)

How do I run this code? I tried running it directly with "python runserver", which is not working. How do I run Django and Spark together, submitting the Django app through spark-submit with all the required Spark jar files?


To run this code you have to launch Django through spark-submit rather than the plain Python interpreter, so that the required Spark jars are on the classpath. With the MySQL connector jar:

spark-submit --jars mysql.jar manage.py runserver

Or, if no extra jars are needed:

spark-submit manage.py runserver
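As a side note, the JDBC connection options from the question can be kept in one helper and unpacked into the reader, which keeps credentials out of the view module. This is a minimal sketch; the function name and all connection values are placeholders, not values from the question:

```python
# A sketch of a helper that centralizes the Spark JDBC options.
# All connection values passed in are placeholders.
def mysql_jdbc_options(host, database, table, user, password):
    """Build the options dict for sql.read.format("jdbc").options(**...)."""
    return {
        "url": f"jdbc:mysql://{host}/{database}",
        "driver": "com.mysql.cj.jdbc.Driver",
        "dbtable": table,
        "user": user,
        "password": password,
    }

# Usage inside the Django app (requires a running SparkContext/SQLContext):
#   df = sql.read.format("jdbc").options(
#           **mysql_jdbc_options("localhost", "mydb", "tablename", "xyz", "abc")
#       ).load()
```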

Answered By – sudomudo

This Answer collected from stackoverflow, is licensed under cc by-sa 2.5 , cc by-sa 3.0 and cc by-sa 4.0
