{ "cells": [ { "cell_type": "markdown", "metadata": {}, "source": [ "# Sujet 7 : Autour du SARS-CoV-2 (Covid-19)\n", "\n", "\n", "## Consignes :\n", "\n", "### Prérequis\n", "\n", "Techniques de présentation graphique. Cet exercice peut être réalisé indifféremment en R ou en Python.\n", "\n", "### Sujet\n", "\n", "Le but est ici de reproduire des graphes semblables à ceux du South China Morning Post (SCMP), sur la page The Coronavirus Pandemic et qui montrent pour différents pays le nombre cumulé (c'est-à-dire le nombre total de cas depuis le début de l'épidémie) de personnes atteintes de la maladie à coronavirus 2019.\n", "\n", "Les données que nous utiliserons dans un premier temps sont compilées par le Johns Hopkins University Center for Systems Science and Engineering (JHU CSSE) et sont mises à disposition sur GitHub. C'est plus particulièrement sur les données time_series_covid19_confirmed_global.csv (des suites chronologiques au format csv) disponibles à l'adresse : https://raw.githubusercontent.com/CSSEGISandData/COVID-19/master/csse_covid_19_data/csse_covid_19_time_series/time_series_covid19_confirmed_global.csv, que nous allons nous concentrer.\n", "\n", "Vous commencerez par télécharger les données pour créer un graphe montrant l’évolution du nombre de cas cumulé au cours du temps pour les pays suivants : la Belgique (Belgium), la Chine - toutes les provinces sauf Hong-Kong (China), Hong Kong (China, Hong-Kong), la France métropolitaine (France), l’Allemagne (Germany), l’Iran (Iran), l’Italie (Italy), le Japon (Japan), la Corée du Sud (Korea, South), la Hollande sans les colonies (Netherlands), le Portugal (Portugal), l’Espagne (Spain), le Royaume-Unis sans les colonies (United Kingdom), les États-Unis (US).\n", "\n", "Le nom entre parenthèses est le nom du « pays » tel qu’il apparaît dans le fichier time_series_covid19_confirmed_global.csv. Les données de la Chine apparaissent par province et nous avons séparé Hong-Kong, non pour prendre parti dans les différences entre cette province et l'état chinois, mais parce que c'est ainsi qu'apparaissent les données sur le site du SCMP. Les données pour la France, la Hollande et le Royaume-Uni excluent les territoires d'outre-mer et autres « résidus coloniaux ».\n", "\n", "Ensuite vous ferez un graphe avec la date en abscisse et le nombre cumulé de cas à cette date en ordonnée. Nous vous proposons de faire deux versions de ce graphe, une avec une échelle linéaire et une avec une échelle logarithmique.\n", "\n", "### Question subsidiaire à faire quand on sera sorti du « merdier »\n", "\n", "Vous pourrez également utiliser les données de décès (timeseriescovid19deathsglobal.csv) et refaire les courbes, mais là encore, faites attention lors de l'interprétation. Ces courbes, même si elles paraissent effrayantes, doivent être comparées à la mortalité « normale ». Pour la France des données sont disponibles sur le site de l'INSEE : https://www.insee.fr/fr/information/4470857, ainsi que dans les « Points hebdomadaires » de surveillance de la mortalité diffusés par Santé publique France, comme celui de la semaine 12 (le site étant très mal conçu pour quiconque souhaite une information spécifique, le plus simple est de passer par un moteur de recherche généraliste…).\n", "\n", "Pour atténuer les effets dus aux méthodes de comptage, etc., vous pourrez, une fois l'épidémie terminée, prendre les données du nombre total de décès et les normaliser pour 1000 habitants du pays concerné. 
{ "cell_type": "markdown", "metadata": {}, "source": [ "# Getting started\n", "\n", "We will first import the tools needed for this work.\n" ] },
{ "cell_type": "code", "execution_count": 1, "metadata": {}, "outputs": [], "source": [ "%matplotlib inline\n", "\n", "# to check whether the data file is already present\n", "import os\n", "# to download the data\n", "import urllib.request\n", "\n", "# to display the graphs\n", "import matplotlib.pyplot as plt\n", "# to handle missing values (NaN)\n", "import numpy as np\n", "# to process the data\n", "import pandas as pd" ] },
{ "cell_type": "markdown", "metadata": {}, "source": [ "We can now download the [data](https://raw.githubusercontent.com/CSSEGISandData/COVID-19/master/csse_covid_19_data/csse_covid_19_time_series/time_series_covid19_confirmed_global.csv) if they have not been downloaded already." ] },
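{ "cell_type": "markdown", "metadata": {}, "source": [ "(Side note added to this notebook: `pandas` can also read the csv straight from its URL, as sketched in the next cell, at the cost of an internet access at every run. The cell after it downloads the file once and caches it locally, which is the approach used in the rest of the notebook.)" ] },
{ "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "# illustrative alternative: read the csv directly from its URL (no local copy);\n", "# `csv_url` is just a throwaway name used in this sketch\n", "csv_url = \"https://raw.githubusercontent.com/CSSEGISandData/COVID-19/master/csse_covid_19_data/csse_covid_19_time_series/time_series_covid19_confirmed_global.csv\"\n", "pd.read_csv(csv_url).head()" ] },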
\n", "\n", "\n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", "
Province/StateCountry/RegionLatLong1/22/201/23/201/24/201/25/201/26/201/27/20...2/23/212/24/212/25/212/26/212/27/212/28/213/1/213/2/213/3/213/4/21
0NaNAfghanistan33.9391167.709953000000...55646556645568055696557075571455733557595577055775
1NaNAlbania41.1533020.168300000000...102306103327104313105229106215107167107931108823109674110521
2NaNAlgeria28.033901.659600000000...112279112461112622112805112960113092113255113430113593113761
3NaNAndorra42.506301.521800000000...10739107751079910822108491086610889109081094810976
4NaNAngola-11.2027017.873900000000...20584206402069520759207822080720854208822092320981
\n", "

5 rows × 412 columns

\n", "
" ], "text/plain": [ " Province/State Country/Region Lat Long 1/22/20 1/23/20 \\\n", "0 NaN Afghanistan 33.93911 67.709953 0 0 \n", "1 NaN Albania 41.15330 20.168300 0 0 \n", "2 NaN Algeria 28.03390 1.659600 0 0 \n", "3 NaN Andorra 42.50630 1.521800 0 0 \n", "4 NaN Angola -11.20270 17.873900 0 0 \n", "\n", " 1/24/20 1/25/20 1/26/20 1/27/20 ... 2/23/21 2/24/21 2/25/21 \\\n", "0 0 0 0 0 ... 55646 55664 55680 \n", "1 0 0 0 0 ... 102306 103327 104313 \n", "2 0 0 0 0 ... 112279 112461 112622 \n", "3 0 0 0 0 ... 10739 10775 10799 \n", "4 0 0 0 0 ... 20584 20640 20695 \n", "\n", " 2/26/21 2/27/21 2/28/21 3/1/21 3/2/21 3/3/21 3/4/21 \n", "0 55696 55707 55714 55733 55759 55770 55775 \n", "1 105229 106215 107167 107931 108823 109674 110521 \n", "2 112805 112960 113092 113255 113430 113593 113761 \n", "3 10822 10849 10866 10889 10908 10948 10976 \n", "4 20759 20782 20807 20854 20882 20923 20981 \n", "\n", "[5 rows x 412 columns]" ] }, "execution_count": 2, "metadata": {}, "output_type": "execute_result" } ], "source": [ "# URL des données\n", "data_url = \"https://raw.githubusercontent.com/CSSEGISandData/COVID-19/master/csse_covid_19_data/csse_covid_19_time_series/time_series_covid19_confirmed_global.csv\"\n", "\n", "# Nom du fichier csv\n", "data_file = \"data_covid.csv\"\n", "\n", "# Téléchargement des données si elles ne sont pas déjà présentes dans le répertoire\n", "if not os.path.exists(data_file):\n", " urllib.request.urlretrieve(data_url, data_file)\n", "\n", "# Affichage des données\n", "raw_data = pd.read_csv(data_file)\n", "raw_data.head()" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Ces données sont organisés sur 274 lignes pour chaque pays et (pour le moment) 412 colonnes présentant la province, le pays, les latitudes et longitudes suivi du nombre de cas par jour du 22 Janvier 2020 au 4 Mars 2021 au moment de l'écriture de ce rapport.\n", "\n", "Pour vérifier qu'aucune date pour aucun pays est manquante, filtrons les données en choisant les colonnes des dates avec la regex `\\d{1,2}\\/\\d{1,2}\\/\\d{2}` qui sélectionne les colonnes dont le nom commence par :\n", "un ou deux chiffres suivi d'un \"/\", deux fois, puis se terminent par deux chiffres\n", "c'est à dire les colonnes des dates (vous pouvez inverser la regex avec `[^\\d{1,2}\\/\\d{1,2}\\/\\d{2}]` pour constater que la commande fait bien l'inverse et nous rend des données avec des NaN)" ] }, { "cell_type": "code", "execution_count": 3, "metadata": {}, "outputs": [ { "data": { "text/html": [ "
\n", "\n", "\n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", "
Province/StateCountry/RegionLatLong1/22/201/23/201/24/201/25/201/26/201/27/20...2/23/212/24/212/25/212/26/212/27/212/28/213/1/213/2/213/3/213/4/21
\n", "

0 rows × 412 columns

\n", "
" ], "text/plain": [ "Empty DataFrame\n", "Columns: [Province/State, Country/Region, Lat, Long, 1/22/20, 1/23/20, 1/24/20, 1/25/20, 1/26/20, 1/27/20, 1/28/20, 1/29/20, 1/30/20, 1/31/20, 2/1/20, 2/2/20, 2/3/20, 2/4/20, 2/5/20, 2/6/20, 2/7/20, 2/8/20, 2/9/20, 2/10/20, 2/11/20, 2/12/20, 2/13/20, 2/14/20, 2/15/20, 2/16/20, 2/17/20, 2/18/20, 2/19/20, 2/20/20, 2/21/20, 2/22/20, 2/23/20, 2/24/20, 2/25/20, 2/26/20, 2/27/20, 2/28/20, 2/29/20, 3/1/20, 3/2/20, 3/3/20, 3/4/20, 3/5/20, 3/6/20, 3/7/20, 3/8/20, 3/9/20, 3/10/20, 3/11/20, 3/12/20, 3/13/20, 3/14/20, 3/15/20, 3/16/20, 3/17/20, 3/18/20, 3/19/20, 3/20/20, 3/21/20, 3/22/20, 3/23/20, 3/24/20, 3/25/20, 3/26/20, 3/27/20, 3/28/20, 3/29/20, 3/30/20, 3/31/20, 4/1/20, 4/2/20, 4/3/20, 4/4/20, 4/5/20, 4/6/20, 4/7/20, 4/8/20, 4/9/20, 4/10/20, 4/11/20, 4/12/20, 4/13/20, 4/14/20, 4/15/20, 4/16/20, 4/17/20, 4/18/20, 4/19/20, 4/20/20, 4/21/20, 4/22/20, 4/23/20, 4/24/20, 4/25/20, 4/26/20, ...]\n", "Index: []\n", "\n", "[0 rows x 412 columns]" ] }, "execution_count": 3, "metadata": {}, "output_type": "execute_result" } ], "source": [ "raw_data[raw_data.filter(regex=\"\\d{1,2}\\/\\d{1,2}\\/\\d{2}\").isnull().any(axis=1)]" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Le tableau en sortie est bien vide, il ne manque donc aucune donnée sur le nombre de cas par jour.\n", "\n", "Listons les pays que nous allons analyser, ils sont tous représentés dans les données par le nom du pays dans la colonne `Country/Region` et `Nan` dans la colonne `Province/State` sauf dans le cas de la Chine ou nous allons devoir faire une somme de toutes les provinces d'un coté et de récupérer Hong-Kong de l'autre." ] }, { "cell_type": "code", "execution_count": 4, "metadata": {}, "outputs": [], "source": [ "countries = [ \n", " \"Belgium\",\n", " \"France\",\n", " \"Germany\",\n", " \"Iran\",\n", " \"Italy\",\n", " \"Japan\",\n", " \"Korea, South\",\n", " \"Netherlands\",\n", " \"Portugal\",\n", " \"Spain\",\n", " \"United Kingdom\",\n", " \"US\",\n", "]" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Enregistrons les lignes consernées dans un nouveau tableau en ajoutant la Chine (nous utiliserons les coordonnées de Beijing pour la longitude et latitude de la Chine, même si l'information ne sera pas réutilisée pour le moment plus tard) :" ] }, { "cell_type": "code", "execution_count": 5, "metadata": {}, "outputs": [], "source": [ "data_list = []\n", "# récupération des données pour tous les pays sauf la Chine\n", "for country in countries:\n", " data_list.append(raw_data[(raw_data['Province/State'].isnull()) & (raw_data['Country/Region']==country)].values.tolist()[0])\n", "# récupération des données pour Hong-Kong\n", "data_list.append(raw_data[(raw_data['Province/State'] == \"Hong Kong\") & (raw_data['Country/Region']== \"China\")].values.tolist()[0])\n", "# récupération des données pour le reste de la Chine en sommant les différentes colonnes\n", "data_china = raw_data[(raw_data['Province/State'] != \"Hong Kong\") & (raw_data['Country/Region']== \"China\")].sum().values.tolist()\n", "# mise à jour des premières colonnes des données de la Chine avec les coordonées de Beijing et ajout à la liste\n", "data_china[0] = pd.np.NaN\n", "data_china[1] = \"China\"\n", "data_china[2] = 40.1824\n", "data_china[3] = 116.4142\n", "data_list.append(data_china)\n", "# création du nouveau tableau avec toutes les données des pays recherchés\n", "data = pd.DataFrame(data_list, columns=raw_data.columns)" ] }, { "cell_type": "code", "execution_count": 6, "metadata": {}, 
"outputs": [ { "data": { "text/html": [ "
\n", "\n", "\n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", "
Province/StateCountry/RegionLatLong1/22/201/23/201/24/201/25/201/26/201/27/20...2/23/212/24/212/25/212/26/212/27/212/28/213/1/213/2/213/3/213/4/21
0NaNBelgium50.8333004.469936000000...757696760809763885766654769414771511772294774344777608780251
1NaNFrance46.2276002.213700002333...3608271363950136640503689034371247437324263736390375924737853263810605
2NaNGermany51.16569110.451526000001...2405263241603724270692436506244417724502952455569246206124729132484306
3NaNIran32.42790853.688046000000...1590605159887516070811615184162315916311691639679164817416566991665103
4NaNItaly41.87194012.567380000000...2832162284856428684352888923290782529252652938371295543429762742999119
5NaNJapan36.204824138.252924222244...426828427732428816429873431093432090432778433700434944436093
6NaNKorea, South35.907757127.766922112234...88120885168892289321896769003190372908169124091638
7NaNNetherlands52.1326005.291300000000...1064598106896010739711079084108402110886901092452109643311014301105544
8NaNPortugal39.399900-8.224500000000...799106800586801746802773803844804562804956805647806626807456
9NaNSpain40.463667-3.749220000000...3161432317064431802123188553318855331885533204531313018431363213142358
10NaNUnited Kingdom55.378100-3.436000000000...4134639414457741545624163085417051941765544182009418840041947854201358
11NaNUS40.000000-100.000000112255...28234656283090852838649228463190285273442857854828637313286940712875998028827144
12Hong KongChina22.300000114.200000022588...10896109131092610950109831100511019110321104611055
13NaNChina40.182400116.414200548641918140120672869...89911899198992589935899418996089971899818999190000
\n", "

14 rows × 412 columns

\n", "
" ], "text/plain": [ " Province/State Country/Region Lat Long 1/22/20 1/23/20 \\\n", "0 NaN Belgium 50.833300 4.469936 0 0 \n", "1 NaN France 46.227600 2.213700 0 0 \n", "2 NaN Germany 51.165691 10.451526 0 0 \n", "3 NaN Iran 32.427908 53.688046 0 0 \n", "4 NaN Italy 41.871940 12.567380 0 0 \n", "5 NaN Japan 36.204824 138.252924 2 2 \n", "6 NaN Korea, South 35.907757 127.766922 1 1 \n", "7 NaN Netherlands 52.132600 5.291300 0 0 \n", "8 NaN Portugal 39.399900 -8.224500 0 0 \n", "9 NaN Spain 40.463667 -3.749220 0 0 \n", "10 NaN United Kingdom 55.378100 -3.436000 0 0 \n", "11 NaN US 40.000000 -100.000000 1 1 \n", "12 Hong Kong China 22.300000 114.200000 0 2 \n", "13 NaN China 40.182400 116.414200 548 641 \n", "\n", " 1/24/20 1/25/20 1/26/20 1/27/20 ... 2/23/21 2/24/21 \\\n", "0 0 0 0 0 ... 757696 760809 \n", "1 2 3 3 3 ... 3608271 3639501 \n", "2 0 0 0 1 ... 2405263 2416037 \n", "3 0 0 0 0 ... 1590605 1598875 \n", "4 0 0 0 0 ... 2832162 2848564 \n", "5 2 2 4 4 ... 426828 427732 \n", "6 2 2 3 4 ... 88120 88516 \n", "7 0 0 0 0 ... 1064598 1068960 \n", "8 0 0 0 0 ... 799106 800586 \n", "9 0 0 0 0 ... 3161432 3170644 \n", "10 0 0 0 0 ... 4134639 4144577 \n", "11 2 2 5 5 ... 28234656 28309085 \n", "12 2 5 8 8 ... 10896 10913 \n", "13 918 1401 2067 2869 ... 89911 89919 \n", "\n", " 2/25/21 2/26/21 2/27/21 2/28/21 3/1/21 3/2/21 3/3/21 \\\n", "0 763885 766654 769414 771511 772294 774344 777608 \n", "1 3664050 3689034 3712474 3732426 3736390 3759247 3785326 \n", "2 2427069 2436506 2444177 2450295 2455569 2462061 2472913 \n", "3 1607081 1615184 1623159 1631169 1639679 1648174 1656699 \n", "4 2868435 2888923 2907825 2925265 2938371 2955434 2976274 \n", "5 428816 429873 431093 432090 432778 433700 434944 \n", "6 88922 89321 89676 90031 90372 90816 91240 \n", "7 1073971 1079084 1084021 1088690 1092452 1096433 1101430 \n", "8 801746 802773 803844 804562 804956 805647 806626 \n", "9 3180212 3188553 3188553 3188553 3204531 3130184 3136321 \n", "10 4154562 4163085 4170519 4176554 4182009 4188400 4194785 \n", "11 28386492 28463190 28527344 28578548 28637313 28694071 28759980 \n", "12 10926 10950 10983 11005 11019 11032 11046 \n", "13 89925 89935 89941 89960 89971 89981 89991 \n", "\n", " 3/4/21 \n", "0 780251 \n", "1 3810605 \n", "2 2484306 \n", "3 1665103 \n", "4 2999119 \n", "5 436093 \n", "6 91638 \n", "7 1105544 \n", "8 807456 \n", "9 3142358 \n", "10 4201358 \n", "11 28827144 \n", "12 11055 \n", "13 90000 \n", "\n", "[14 rows x 412 columns]" ] }, "execution_count": 6, "metadata": {}, "output_type": "execute_result" } ], "source": [ "data" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Nous obtenons ainsi notre tableau avec 14 lignes contenant les informations que nous voulons afficher. Nous allons maintenant générer les graphiques." ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [] } ], "metadata": { "kernelspec": { "display_name": "Python 3", "language": "python", "name": "python3" }, "language_info": { "codemirror_mode": { "name": "ipython", "version": 3 }, "file_extension": ".py", "mimetype": "text/x-python", "name": "python", "nbconvert_exporter": "python", "pygments_lexer": "ipython3", "version": "3.6.4" } }, "nbformat": 4, "nbformat_minor": 2 }