BLOG



python(1)

  • [Dataproc] PySpark Job with Airflow (Composer): Get Data From MySQL

    UBUNTU CLIENT CONFIGURATION
    export TEMPLATE_ID=workflow-mytest
    export WORK_CLUSTER_NAME=cluster-mytest
    export REGION=asia-northeast3
    export BUCKET_NAME=jay-pyspark-mytest  # also needed in the Airflow task DAG
    export PROJECT_ID=                     # also needed in the Airflow task DAG
    export PYTHON_FILE=pyspark-job.py
    export STEP_ID=first_step              # some name like "Get Data"

    PYTHON CODE
    $ vi pywork.py
    import pymysql
    import sys
    import pandas as pd
    f..
    (a sketch of what the full script might look like follows this entry)

    2021.11.01
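    Based only on the imports visible in the truncated preview, here is a minimal sketch of what a full pywork.py might look like: pull rows from MySQL with pymysql into a pandas DataFrame, then hand them to Spark and write them out to the job's bucket. The host, credentials, database, table name, and output path below are placeholders, not taken from the post.

    # Hypothetical reconstruction of pywork.py; connection details and paths are placeholders.
    import sys

    import pandas as pd
    import pymysql
    from pyspark.sql import SparkSession


    def main():
        # Read a table from MySQL into a pandas DataFrame via pymysql.
        conn = pymysql.connect(
            host="mysql.example.internal",  # placeholder host
            user="reader",                  # placeholder user
            password=sys.argv[1],           # e.g. passed in as a job argument
            db="mydb",                      # placeholder database
            charset="utf8mb4",
        )
        try:
            pdf = pd.read_sql("SELECT * FROM my_table", conn)
        finally:
            conn.close()

        # Convert to a Spark DataFrame and write to the bucket set in BUCKET_NAME above.
        spark = SparkSession.builder.appName("get-data-from-mysql").getOrCreate()
        sdf = spark.createDataFrame(pdf)
        sdf.write.mode("overwrite").parquet("gs://jay-pyspark-mytest/output/my_table")
        spark.stop()


    if __name__ == "__main__":
        main()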