AI in the Battle against fakes

Counterfeit products have long been a growing pain for companies. Beyond lost revenue, they damage brand reputation and customer confidence. One of our partners was asked to build a solution for a global electronics brand that can identify counterfeit products from a single picture taken on a smartphone.

In this session we zoom into the building blocks that make this AI solution work. We'll find out that there is much more to it than just training a convolutional neural network, and we look at challenges such as how to manage and monitor the AI model, and how to build and improve it in a way that fits your DevOps production chain.

Learn how we used Azure Functions, Cosmos DB and Docker to build a solid foundation. See how we used Azure Machine Learning service to train the models. And find out how we used Azure DevOps to control, build and deploy this state-of-the-art solution.

AI used to only be in the hands of data science PhDs. Now any developer can integrate this exciting technology into their next project. Join us to find out how.

Henk Boelman

October 18, 2019


Transcript

1. Henk Boelman, Cloud Advocate @ Microsoft. AI in the battle against fakes. HenkBoelman.com @hboelman
2. Agenda: Assignment & challenges; Solution architecture; Icon detection and validation with Custom Vision; Logo classification with AMLS; Recap and lessons learned.
3. Ingredients: Azure Durable Functions, Docker Containers, Azure Kubernetes Service, Cosmos DB, Azure Blob Storage, Umbraco, AMLS, Cognitive Services, Azure DevOps, API Management.
4. AI Engine flow: Photo of product → Pre-process image → Match to product in DB → Text AI / Logo AI / Icon AI → End evaluation.
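The "End evaluation" step above can be sketched as a simple score combiner. This is a hypothetical illustration, not the production logic: the function name, the equal weighting of the three models and the 0.8 threshold are all assumptions.

```python
# Hypothetical sketch of the "End evaluation" step: combine the
# genuine-probabilities reported by the text, logo and icon models
# into a single verdict. Weighting and threshold are assumptions.

def end_evaluation(scores, threshold=0.8):
    """scores maps model name -> probability (0..1) that the product is genuine."""
    avg = sum(scores.values()) / len(scores)
    return {"genuine": avg >= threshold, "confidence": round(avg, 2)}

verdict = end_evaluation({"text": 0.95, "logo": 0.90, "icon": 0.85})
# verdict -> {'genuine': True, 'confidence': 0.9}
```

A real evaluator might instead weight the models by measured reliability, or require every model to agree before declaring a product genuine.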
5. Icon validator API: object detection followed by classification. Input: the product photo. Output, per detected icon, e.g. { Icon: Car, Genuine: False, Score: 95% }, { Icon: Truck, Genuine: True, Score: 85% }, with underlying classification scores such as Genuine: 2% | Fake: 95% and Genuine: 85% | Fake: 3%.
6. Icon detection with the Custom Vision Service: label images with the Visual Object Tagging Tool, export, train the model, deploy the model, test the model, run the test set against the model.
7. Icon verification with the Custom Vision Service API: images in 2 folders (Genuine / Fake), image augmentation, label images, export, train the model, deploy the model, test the model, run the test set against the model.
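Image augmentation, mentioned on the slide, enlarges a small Genuine/Fake training set by adding transformed copies of every image. A minimal, library-free sketch of the idea using a horizontal flip on a nested-list "image" (a real pipeline would use Keras or Pillow transforms and more variations):

```python
# Toy image augmentation: represent an image as rows of pixel values and
# grow the training set with mirrored copies. Purely illustrative; a real
# pipeline would also rotate, shift and vary brightness.

def flip_horizontal(image):
    return [list(reversed(row)) for row in image]

def augment(images):
    # Keep every original and add one horizontally mirrored copy.
    return images + [flip_horizontal(img) for img in images]

dataset = [[[1, 2], [3, 4]]]   # one 2x2 "image"
augmented = augment(dataset)   # now 2 images: original + mirrored
```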
8. Logo classification: the Computer Vision API cuts out the PHILIPS logo → pre-process → predict image → return the result to the Icon Validator API.
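The "cut out" step above amounts to cropping the detected logo region before prediction. A minimal sketch on a plain nested-list image; in the actual solution the region coordinates would come from the Computer Vision API's detection result, and the values here are invented for illustration:

```python
# Crop a rectangular region (e.g. the detected logo) out of an image
# represented as rows of pixels. Coordinates are illustrative only.

def crop(image, top, left, height, width):
    return [row[left:left + width] for row in image[top:top + height]]

image = [
    [0, 0, 0, 0],
    [0, 7, 7, 0],
    [0, 7, 7, 0],
    [0, 0, 0, 0],
]
logo = crop(image, top=1, left=1, height=2, width=2)
# logo -> [[7, 7], [7, 7]]
```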
9. Logo classification with Azure Machine Learning Service: images in 2 folders (Genuine / Fake), image augmentation, label images, pre-process, train the model, deploy the model, test the model, run the test set against the model.
10. Azure Machine Learning Service: a fully managed cloud service that enables you to easily build, deploy, and share predictive analytics solutions.
11. Azure Machine Learning Service concepts: Datasets (registered, known data sets); Experiments (training runs); Models (registered, versioned models); Endpoints, both Real-time Endpoints (deployed model endpoints) and Pipeline Endpoints (training workflows); Compute (managed compute); Datastores (connections to data).
12. Create a workspace:
    ws = Workspace.create(name='<NAME>',
                          subscription_id='<SUBSCRIPTION ID>',
                          resource_group='<RESOURCE GROUP>',
                          location='westeurope')
    ws.write_config()
    ws = Workspace.from_config()
13. Create compute:
    cfg = AmlCompute.provisioning_configuration(vm_size='STANDARD_NC6',
                                                min_nodes=1,
                                                max_nodes=6)
    cc = ComputeTarget.create(ws, '<NAME>', cfg)
14. Create an estimator:
    params = {'--data-folder': ws.get_default_datastore().as_mount()}
    estimator = TensorFlow(source_directory=script_folder,
                           script_params=params,
                           compute_target=computeCluster,
                           entry_script='train.py',
                           use_gpu=True,
                           conda_packages=['scikit-learn', 'keras', 'opencv'],
                           framework_version='1.10')
15. Submit the experiment to the cluster:
    run = exp.submit(estimator)
    RunDetails(run).show()
16. Demo: creating and running an experiment (create an experiment, create a training file, create an estimator, submit to the AI cluster).
17. What happens when you submit a run (Azure Notebook, Compute Target, Experiment, Docker Image, Data store): 1. Snapshot the folder and send it to the experiment; 2. Create the Docker image; 3. Deploy the Docker image and snapshot to compute; 4. Mount the datastore to compute; 5. Launch the script; 6. Stream stdout, logs and metrics; 7. Copy over the outputs.
18. Register the model:
    model = Model.register(ws,
                           model_name='My Model Name',
                           model_path='./savedmodel',
                           description='My cool Model')
19. Demo: registering and testing the model.
20. Project layout: app.py => run the model as an API; Dockerfile => Docker configuration; requirements.txt => Python packages to be installed; aml_config/config.json => Azure Machine Learning config; downloadmodel.py => build step to retrieve the latest model.
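As a rough idea of what such an app.py could look like, here is a standard-library-only sketch that serves predictions over HTTP. The stub predict function, the route and the threshold are all assumptions standing in for the real model that downloadmodel.py would have fetched:

```python
# Hypothetical app.py: wrap a loaded model behind a tiny JSON-over-HTTP
# endpoint. The "model" below is a stub; production code would load the
# registered Azure ML model instead.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

def predict(pixels):
    # Stub scoring: mean brightness as a stand-in for a trained classifier.
    score = sum(pixels) / (255 * len(pixels)) if pixels else 0.0
    return {"genuine": score >= 0.5, "score": round(score, 2)}

class ScoreHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        length = int(self.headers.get("Content-Length", 0))
        payload = json.loads(self.rfile.read(length) or b"{}")
        body = json.dumps(predict(payload.get("pixels", []))).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    HTTPServer(("0.0.0.0", 5000), ScoreHandler).serve_forever()
```

Packaging exactly this kind of small, single-purpose scoring service in Docker is what makes the "services (API) for specific tasks / models" approach from the recap work.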
21. Recap: Azure Functions for the orchestrator; Custom Vision for easy models; Azure Machine Learning Service; make services (APIs) for specific tasks / models; Git / CI / CD / test everything.
22. Lessons learned: You never have enough data. Apply DevOps practices from the start. Break the problem into smaller pieces. Think big, start small, grow fast.