Schedules

Event Schedule

New talks are added every week. Please check back again.

All timings in IST

  • Day 1

    October 29, 2020

  • This session focuses on adversarial robustness in deep learning: why it matters, the different types of adversarial attacks, and common approaches to training neural networks that are robust to them. Deep learning has brought tremendous achievements in fields such as computer vision and natural language processing. In spite of this success, modern deep learning systems are still prone to adversaries. In terms of computer vision, consider an image of a hog (X1). A deep learning-based image classifier can successfully classify X1 as a hog. Now consider another instance of the same image, X2, which is a slightly perturbed version of X1. To human eyes it is still a hog, but to that same classifier it can appear as an airliner. Such perturbations are referred to as adversarial examples. The idea holds equally for natural language processing. Consider the output of an entailment model that tries to determine whether two phrases entail each other: by changing just one word (highlighted in red), the model changes its prediction from entailment to contradiction. By the end of the session, attendees will understand what adversarial perturbations mean in deep learning and the common recipes for dealing with them.
    Workshop
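The hog/airliner example above can be sketched in miniature. Below is a hedged illustration of the Fast Gradient Sign Method (FGSM), one common way to craft such perturbations, applied to a toy logistic classifier rather than a deep network; the weights, input, and epsilon are invented for the demonstration and are not from the session.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def fgsm_perturb(x, w, b, y, eps):
    """FGSM for a logistic classifier: step the input in the direction
    of the sign of the loss gradient. For cross-entropy loss, the
    gradient w.r.t. the input x is (p - y) * w, where p = sigmoid(w.x + b)."""
    p = sigmoid(w @ x + b)
    grad_x = (p - y) * w
    return x + eps * np.sign(grad_x)

# A toy classifier and an input it classifies confidently as class 1.
w = np.array([2.0, -1.0])
b = 0.0
x = np.array([1.0, 0.5])                      # w.x + b = 1.5 -> p ~ 0.82
x_adv = fgsm_perturb(x, w, b, y=1.0, eps=0.9)  # small signed perturbation

p_clean = sigmoid(w @ x + b)   # confident "hog"
p_adv = sigmoid(w @ x_adv + b)  # prediction flips below 0.5
```

The perturbation looks tiny per coordinate, yet it flips the prediction; the same mechanism scales up to image classifiers.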

  • This study introduces machine learning (ML) and deep learning (DL) models for predicting self-employment default rates using credit information. Most preceding studies on corporate credit risk focus on bankruptcy prediction models targeting listed companies, using financial information as the main variables and macro-economic information as auxiliary variables. However, bankruptcy prediction models are difficult to apply where financial information is insufficient, as with small-and-medium enterprises (SMEs) and self-employment businesses. In addition, studies predicting corporate default rates by industry remain scarce. We therefore used micro-level variables derived from credit information, such as loans and overdue history of individual businesses in the Korean manufacturing sector from April 2014 through June 2019, together with typical macro-economic variables, to improve the performance of default-rate prediction. We then evaluated the effect that algorithms such as Ridge, Random Forest (RF), and Deep Neural Network (DNN) have on the performance of the proposed default-rate prediction model for self-employment. The DNN model is implemented for two purposes: as a submodel for the selection of credit information variables, and as the final model that predicts default rates from the selected input variables. The two consist of 2 and 3 hidden layers, respectively, with 5 nodes per layer. The activation function, solver, and learning rate were determined through hyper-parameter tuning.
As a result, when the credit information variables were used together with the macro-economic variables, prediction performance increased by 3.48 percentage points (R2=0.981) compared to the Ridge model using only macro-economic variables, and the performance of the final DNN model increased by 4.74 percentage points (R2=0.993).
    Tech Talks

  • Businesses today work with multiple data sources, and gleaning accurate insights from them continues to be a priority. Join the session by experts Arpita Sur and Loveesh Bhatt from Ugam, a Merkle company, to uncover how data scientists can leverage stacked ensembles to improve the performance of predictive models. The session will cover the application of ensemble modeling across use cases, with a deep dive into its application for computer vision. Attend this session for practical tips and learnings on ensemble modeling.
    Tech Talks

  • Indian manufacturing has lagged in leveraging AI to drive productivity and move key KPIs. Connected machines, interdependent processes in complex manufacturing environments, and sensor-based measurement now make it feasible for AI to measure its integrated impact. This concept of AI-enabled manufacturing is formally known as Industry 4.0.
    Tech Talks

  • Building accurate machine learning models is a constant endeavor for data scientists. However, building high-performance models is easier said than done. Join this hands-on workshop by Ugam’s machine learning expert Loveesh Bhatt to learn how ensemble models, built using different Python libraries, can improve model accuracy and performance. A live implementation on the widely attempted Titanic Survival Analysis classification problem will show how stacked ensembles can outperform traditional algorithms. The workshop will entail building ensembles that are highly customizable at the individual-algorithm level. In addition, you will learn to quickly experiment with stacked ensembles using highly automated libraries from H2O, such as AutoML.
    Workshop
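As a rough preview of the idea (not the workshop's actual code, which uses H2O), here is a minimal stacked ensemble in scikit-learn on synthetic data: base learners make out-of-fold predictions, and a meta-learner combines them.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier, StackingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

# Synthetic binary classification data standing in for Titanic-style features.
X, y = make_classification(n_samples=500, n_features=10, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Each base estimator is individually configurable; the meta-learner
# (logistic regression) is trained on their cross-validated predictions.
stack = StackingClassifier(
    estimators=[
        ("rf", RandomForestClassifier(n_estimators=50, random_state=0)),
        ("tree", DecisionTreeClassifier(max_depth=3, random_state=0)),
    ],
    final_estimator=LogisticRegression(),
    cv=5,
)
stack.fit(X_tr, y_tr)
score = stack.score(X_te, y_te)  # held-out accuracy of the stacked model
```

The `cv=5` argument is what makes this stacking rather than simple blending: the meta-learner sees only out-of-fold base predictions, which limits leakage.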

  • With increasing transactions and avenues for spending money, not only the financial industry but also consumers are becoming victims of fraud and scams. Per the Nilson Report, payment-card-related fraud losses alone exceeded $28 billion in 2018. In this fast-paced, ever-evolving payments industry, learn how anomaly detection and behavior analytics are helping catch these fraud trends and prevent bad actors from stealing money.
    Tech Talks
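As a hedged, minimal illustration of the anomaly-detection idea the talk refers to (not the production systems it describes), a z-score detector flags transactions that deviate far from the typical spending pattern; the amounts and threshold below are invented for the demonstration.

```python
import numpy as np

def zscore_anomalies(amounts, threshold=3.0):
    """Flag transactions whose amount lies more than `threshold`
    standard deviations from the mean -- a baseline statistical
    anomaly detector, not a full behavioral-analytics system."""
    amounts = np.asarray(amounts, dtype=float)
    z = (amounts - amounts.mean()) / amounts.std()
    return np.where(np.abs(z) > threshold)[0]

# Seven ordinary card transactions and one suspiciously large one.
amounts = [20, 35, 18, 22, 30, 25, 19, 5000]
flagged = zscore_anomalies(amounts, threshold=2.0)  # indices of outliers
```

Real fraud systems layer behavioral features (merchant, time, geography) on top of this kind of statistical baseline.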

  • When algorithms take over pricing, data is the key to getting it right! Insurers have always relied on data to calculate risk. However, for most Indian insurers, customer data is limited by their distribution networks. This is changing with digital insurance. In her session, Shilpi will talk about how new-age insurers like Acko use data and ML to personalise price for a customer rather than a cohort. If you are a safe driver or follow healthy practices, machine learning can and will help you get a fair price for your insurance. Shilpi will also talk about how seemingly unrelated factors, like your favourite colour or phone model, influence claiming behaviour.
    Tech Talks

  • Day 2

    October 30, 2020

  • For more details visit https://dldc.adasci.org/workshop/
    Workshop

  • Designing robust and accurate predictive models for stock price prediction has been an active area of research for a long time. While supporters of the efficient market hypothesis claim that it is impossible to forecast stock prices accurately, many researchers believe otherwise. There are propositions in the literature demonstrating that, if properly designed and optimized, predictive models can very accurately and reliably predict future values of stock prices. This paper presents a suite of deep learning-based models for stock price prediction. We use the historical records of the NIFTY 50 index listed in the National Stock Exchange (NSE) of India, during the period from December 29, 2008 to May 15, 2020, for building and testing the models. Our proposition includes two regression models built on convolutional neural networks (CNNs) and three long short-term memory (LSTM) network-based predictive models. For forecasting the open values of the NIFTY 50 index records, we adopted a multi-step prediction technique with walk-forward validation. In this approach, the open values of the NIFTY 50 index are predicted on a time horizon of one week; once the week is over, the actual index values are included in the training set, the model is retrained, and the forecasts for the next week are made. We present detailed results on the forecasting accuracies of all our proposed models. The results show that while all the models are very accurate in forecasting the NIFTY 50 open values, the CNN model that uses the previous one week’s data as input is the fastest in execution and the most accurate in forecasting performance. On the other hand, the encoder-decoder CNN-LSTM model that uses the previous two weeks’ data as input is found to be the least accurate.
    Tech Talks
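The walk-forward validation loop described in the abstract can be sketched as follows. This is an assumption-laden toy: a persistence forecast on synthetic data stands in for the paper's CNN/LSTM models, and the one-week horizon is taken as 5 trading days.

```python
import numpy as np

def walk_forward(series, horizon=5, train_size=50):
    """Multi-step walk-forward validation: forecast one week (`horizon`
    steps) at a time, then fold the actual values back into the
    training window before forecasting the next week."""
    preds, actuals = [], []
    history = list(series[:train_size])
    i = train_size
    while i + horizon <= len(series):
        # Placeholder model: persistence forecast (repeat last value).
        # The paper's CNN/LSTM models would be refit on `history` here.
        forecast = [history[-1]] * horizon
        preds.extend(forecast)
        actuals.extend(series[i:i + horizon])
        history.extend(series[i:i + horizon])  # training window grows
        i += horizon
    return np.array(preds), np.array(actuals)

rng = np.random.default_rng(0)
series = np.cumsum(rng.normal(0, 1, 120)) + 100  # synthetic "open" values
preds, actuals = walk_forward(series)
rmse = float(np.sqrt(np.mean((preds - actuals) ** 2)))
```

The key property is that the model never sees future values at forecast time, yet is regularly refreshed with the latest actuals, mirroring how such a system would run in production.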

  • Machine learning models are ubiquitous and have shown tremendous promise in the past few years; however, the most important question still remains: is it in production? MLOps empowers data scientists and machine learning engineers to bring together their knowledge and skills to simplify going from model development to release and deployment. This lets practitioners automate the end-to-end machine learning lifecycle to frequently update models, test new models, and continuously roll out new ML models alongside other applications and services. We will cover how you can get started with MLOps using GitHub Actions and Azure ML as building blocks.
    Tech Talks

  • Vogo is an automated dock-less scooter rental platform that aims to solve the problem of last-mile connectivity with an on-demand service that enables users to commute instantly. The platform is built with a unique blend of IoT and Bluetooth technology. Application of AI has created impact at multiple levels by making this platform more intelligent, reducing costs and improving user experience. This session aims to go in-depth on a few of these AI-powered engines that combine deep-learning algorithms with IoT data.
    Tech Talks

  • Digital advertising is a form of advertisement that uses the internet as a medium for reaching customers. Advertisers identify websites visited by their potential customers and serve ads by bidding on the available ad slots. Every day, billions of such bids take place in online programmatic auctions where advertisers compete for an ad slot by bidding on it. The process of identifying where, and to whom, an advertiser should serve an ad is referred to as a targeting strategy. Broadly, targeting strategies fall into two buckets: cookie-based targeting, where browser cookies are used to serve ads to relevant users; and contextual targeting, where websites relevant to the advertiser are identified so that their ad slots can be bid on. Due to growing privacy concerns, with browsers phasing out cookies and recent regulations like the General Data Protection Regulation (GDPR) in Europe, cookie-based targeting has become difficult. With the inevitable deprecation of cookie-based strategies, it becomes paramount to identify sophisticated contextual targeting strategies that advertisers can leverage. This paper proposes a data-driven approach that creates a new contextual strategy from web traffic data by segmenting websites into groups for targeting. First, geometric deep learning techniques are used to generate website embeddings, i.e. representations of the websites in a vector space. These embeddings are clustered into website segments to be used for optimizing digital advertising. The paper then compares the techniques discussed using a heuristic criterion to identify the most suitable method for vector representation.
    Tech Talks

  • In real-world applications, class imbalance is a very common problem, encountered in areas ranging from medical diagnosis to anomaly detection. An imbalanced class distribution makes extracting useful information very challenging for many popular algorithms: optimizing overall accuracy can heavily skew predictions toward the majority class label and, consequently, the false positive rate increases. Several methods have been introduced to address this problem, but they are less effective when the minority class has very few examples. The growing popularity of deep learning frameworks has led to the development of synthetic example generators such as the generative adversarial network (GAN) and the variational autoencoder (VAE). The variational autoencoder is a deep learning-based generative modelling technique that uses variational inference to learn the data distribution. In this paper we propose a synergistic over-sampling method that generates informative synthetic minority-class data by filtering the noise from the over-sampled examples. A disentangled variational autoencoder is used to generate the synthetic examples, while the filtering is carried out using NEATER, a game-theory-based algorithm that efficiently handles the filtering of noisy examples as a non-cooperative game. Experimental results on several real-life imbalanced datasets, taken from UCI and KEEL, demonstrate the effectiveness of the proposed method for binary classification problems.
    Tech Talks
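For readers new to over-sampling, here is a hedged sketch of the basic idea of generating synthetic minority examples by interpolating between real ones (the SMOTE approach); the paper itself uses a disentangled VAE plus NEATER filtering, which is considerably more involved.

```python
import numpy as np

def interpolate_oversample(X_min, n_new, rng):
    """Generate synthetic minority examples by interpolating between
    random pairs of real minority examples. Each synthetic point is a
    convex combination of two real points, so it stays inside the
    minority region of feature space."""
    idx_a = rng.integers(0, len(X_min), n_new)
    idx_b = rng.integers(0, len(X_min), n_new)
    lam = rng.random((n_new, 1))  # interpolation weight per new sample
    return X_min[idx_a] + lam * (X_min[idx_b] - X_min[idx_a])

rng = np.random.default_rng(0)
X_min = rng.normal(0, 1, (10, 3))           # 10 minority examples, 3 features
X_syn = interpolate_oversample(X_min, 40, rng)  # 40 synthetic examples
```

A generative model such as a VAE replaces the simple interpolation with samples from a learned distribution, and a filter such as NEATER then discards synthetic points that land in noisy regions.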

  • Organ failure is one of the leading causes of mortality in today’s world, and the urgent need for organ transplants combined with the lack of donors could lead to a global catastrophe in the near future. An optimal solution is the use of rapid prototyping techniques in combination with quantum-enabled deep learning CNNs to produce 3D bioartificial “organoids” driven by pluripotent stem cells, which would ensure the organ’s adaptability to the host. The procedure’s accuracy can be enhanced by implementing a set of features such as evolutionary and genetic algorithms.
    Tech Talks

  • Deep learning is advancing areas such as knowledge management, gene regulation, genome organization, and mutation-effect analysis. It helps in identifying disease symptoms, analysing undiagnosable diseases, detecting introgression, estimating historical recombination rates, identifying selective sweeps, and estimating the demography of population genetics. Deep learning methods can be applied to disease identification, undiagnosable disease analysis, and personalized treatment recommendation on datasets on the order of millions of records. Black-box deep neural networks can learn from data on disease symptoms, variants, and patient health history to build a disease model. Small variations help in identifying patterns for such models, and the size of the input data helps improve model accuracy. The DeepVariant method helps identify small variations in a patient’s health data, while patient history and a knowledge base are used to predict the association of diseases with symptoms. Breast cancer, pneumonia, and other diseases can be diagnosed from medical images using CNN algorithms. The CNN technique consists of two steps, convolution and pooling, which reduce an image to the basic features used for classification. Convolution views the image by breaking it into small patches, and a CNN can have multiple convolution and activation layers. The convolution layer acts like a filter, applying the dot product of the actual pixel input values and the assigned weights; the sum of the output is used to filter the image pixels. The pooling stage creates a matrix smaller than the actual image. Skin image analysis can be done using machine learning and computer vision: images are analysed to predict and prevent the onset of skin disease.
Recommendation engines based on AI algorithms are used to personalise treatment for skincare problems based on the user’s skin type.
    Tech Talks
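The convolution and pooling steps described above can be sketched in plain NumPy. This is an illustrative toy, not code from the talk; the 6x6 "image" and edge kernel are invented for the demonstration.

```python
import numpy as np

def conv2d(image, kernel):
    """Valid 2-D convolution: slide the kernel over the image and take
    the dot product of each patch with the kernel weights."""
    kh, kw = kernel.shape
    out_h = image.shape[0] - kh + 1
    out_w = image.shape[1] - kw + 1
    out = np.zeros((out_h, out_w))
    for i in range(out_h):
        for j in range(out_w):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

def max_pool(fmap, size=2):
    """Max pooling: keep the strongest activation per patch, producing
    a matrix smaller than the input feature map."""
    h, w = fmap.shape[0] // size, fmap.shape[1] // size
    return fmap[:h * size, :w * size].reshape(h, size, w, size).max(axis=(1, 3))

image = np.arange(36, dtype=float).reshape(6, 6)     # toy 6x6 "image"
edge_kernel = np.array([[1.0, -1.0], [1.0, -1.0]])   # vertical edge detector
fmap = conv2d(image, edge_kernel)   # 5x5 feature map
pooled = max_pool(fmap)             # 2x2 reduced representation
```

Stacking several such convolution + activation + pooling stages, then a classifier on the pooled features, is the structure behind the medical-imaging CNNs the talk describes.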

  • This paper is about predicting the movement of the stocks that constitute the S&P 500 index. Historically, many approaches using various methods have been tried to predict stock movement, and traditional mathematical approaches are currently used in the market for algorithmic trading and alpha-generating systems [1, 2]. The recent success of artificial neural networks has created a lot of interest and paved the way for prediction using cutting-edge research in machine learning and deep learning. Some of these papers have done a great job of implementing and explaining the benefits of these new technologies. Although most of these papers do not address the complexity of financial data and mostly utilize single-dimension data, they were still successful in laying the ground for future research in this comparatively new area.
    Tech Talks

Full Day Deep Learning Workshop

The conference will feature a full-day hands-on workshop track on deep learning. The workshop will provide a broad overview of deep learning to help participants get started. A certificate of participation will be provided to all attendees of the workshop.


Extraordinary Speakers

Meet top developers, innovators & researchers in the space of computer vision.

  • Early Bird Pass

    Available till 25th Sep
  • Access to all tracks & workshops
  • Access the recorded sessions later
  • Certificate of attendance provided
  • Access to online networking with attendees & speakers
  • Group discount available
  • $25
  • Late Pass

    Available from 17th Oct
  • Access to all tracks & workshops
  • Access the recorded sessions later
  • Certificate of attendance provided
  • Access to online networking with attendees & speakers
  • No Group discount
  • $75