Deep convolutional neural networks have demonstrated the capability of achieving record-breaking results on highly challenging datasets \cite{Krizhevsky_2017}. Convolutional networks were initially inspired by biological processes, particularly the animal visual cortex, in which the receptive fields of individual cortical neurons overlap to cover the entire visual field \cite{Hubel_1968,Fukushima_1982}.
Different architectures have been formulated as ready-to-use models pre-trained on the ImageNet dataset, ensuring that we do not reinvent architectures from scratch but only train the last few layers \cite{inproceedings}. This efficient process, called transfer learning, has revolutionized how models are created, as it solves the problem of insufficient training data \cite{726791,NIPS1989_293,russakovsky2014imagenet,simonyan2014deep,7298594,tan2018survey}. In our case we use one architecture that is not overly complicated yet delivers superior results compared to many alternatives: ResNet-34 \cite{he2015deep}.
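The core idea of transfer learning, reusing pretrained weights and training only the final layers, can be illustrated with a toy sketch in plain Python. The names here (\texttt{pretrained\_features}, \texttt{train\_head}) are hypothetical stand-ins, not fast.ai API; a real pipeline would use a frozen ResNet-34 backbone with a trainable classifier head.

```python
import math

# Toy illustration of transfer learning: a fixed "pretrained" feature
# extractor feeds a small trainable classifier head. Only the head's
# weights are updated, mirroring how frozen ResNet-34 layers are reused.

def pretrained_features(x):
    """Stand-in for the frozen pretrained layers: a fixed nonlinear map."""
    return [x[0] + x[1], x[0] * x[1]]

def train_head(samples, labels, lr=0.1, epochs=200):
    """Train a logistic-regression head on top of the frozen features
    with plain stochastic gradient descent on the log loss."""
    w = [0.0, 0.0]
    b = 0.0
    for _ in range(epochs):
        for x, y in zip(samples, labels):
            f = pretrained_features(x)          # frozen layers: no update
            z = w[0] * f[0] + w[1] * f[1] + b   # trainable head
            p = 1.0 / (1.0 + math.exp(-z))      # sigmoid probability
            g = p - y                           # gradient of the log loss
            w[0] -= lr * g * f[0]
            w[1] -= lr * g * f[1]
            b -= lr * g
    return w, b
```

Because the feature extractor is fixed, only the handful of head parameters must be learned, which is why transfer learning works with far less training data than training a full network from scratch.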
We then set the metric for measuring how well our model is doing to the error rate; we could also use accuracy, and the fast.ai library offers other metric options depending on the interest of the user. We then call the built-in cnn\_learner from the fast.ai library, passing in our data, the ResNet architecture, and the metric we are measuring.
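The error-rate metric mentioned above is simply the fraction of misclassified examples, i.e. one minus the accuracy. A minimal stand-alone version in plain Python (fast.ai ships its own \texttt{error\_rate}; this sketch just makes the definition concrete):

```python
def error_rate(predictions, targets):
    """Fraction of predictions that disagree with the true labels."""
    if len(predictions) != len(targets):
        raise ValueError("predictions and targets must be the same length")
    wrong = sum(p != t for p, t in zip(predictions, targets))
    return wrong / len(targets)

def accuracy(predictions, targets):
    """Complement of the error rate."""
    return 1.0 - error_rate(predictions, targets)
```

For example, `error_rate(["cat", "dog", "cat"], ["cat", "dog", "dog"])` returns one third, since one of the three predictions is wrong.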
We can then run training for one cycle, passing in the number of epochs the model should run for. Too few or too many epochs can be a problem; it is best to stop when the error rate no longer decreases (or the accuracy no longer increases, if using that metric). We then save the model; it may already be accurate enough for the next stage of inference and deployment. However, if a more accurate model is needed, there are techniques such as unfreezing the model and searching for a suitable learning rate with the built-in learning rate finder, which is novel to fast.ai. Using that learning rate, further cycles can be fitted until the model reaches a suitable accuracy; in our case we reached 90\% accuracy, a remarkable feat.
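The one-cycle training mentioned above varies the learning rate over the run: it warms up from a small value to a chosen maximum and then anneals back down. A simplified cosine-based sketch of such a schedule is below; the function and parameter names are hypothetical, not fast.ai's internals.

```python
import math

def one_cycle_lr(step, total_steps, lr_max, pct_start=0.3, div=25.0):
    """Simplified cosine one-cycle schedule: warm up from lr_max/div to
    lr_max over the first pct_start of training, then anneal back down."""
    warmup_steps = int(total_steps * pct_start)
    lr_min = lr_max / div
    if step < warmup_steps:
        # Warm-up phase: learning rate rises from lr_min to lr_max.
        t = step / max(1, warmup_steps)
        return lr_min + (lr_max - lr_min) * (1 - math.cos(math.pi * t)) / 2
    # Annealing phase: learning rate decays from lr_max back to lr_min.
    t = (step - warmup_steps) / max(1, total_steps - warmup_steps)
    return lr_min + (lr_max - lr_min) * (1 + math.cos(math.pi * t)) / 2
```

The learning rate found with the learning rate finder would be passed in as \texttt{lr\_max}; the early low-rate steps stabilize the newly unfrozen layers before the peak rate is applied.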

MODEL CREATION

Figure 2: Confusion matrix of the classifier's predictions
Table 3: Summary of model results

MODEL TUNING

MODEL SAVING

MODEL INFERENCE(DEVELOPMENT)

III. DEPLOYMENT PHASE

MODEL INFERENCE(PRODUCTION)

DEPLOYMENT

Figure 3: The final deployed product

CONCLUSION

Although there have been many papers prototyping interesting applications of artificial intelligence, very few outline beginner-friendly, non-domain-expert approaches to classifying species. This paper illustrated the tools, steps, and resources required to achieve this feat with little domain knowledge of the subject. Like the advent of electricity in 19XX (quote Edison), it is the application of the tools that will move humanity forward. This paper gives ecologists, from the most advanced labs in the world to the least resourced, the power to prototype world-class classifiers for their specific subjects.

ACKNOWLEDGEMENTS

I would like to thank my advisor, Dr. Melissa Pespeni, and the QUEST fellowship, supported by a grant from NSF bla bla. The figures were made using illustrations from Lucidchart.com and Amazon Web Services icons.