I am an experienced BI developer and architect. I develop end-to-end business solutions for customers. The most valuable part of a data warehouse project is the data analysis, which leads to the most suitable solutions for the business requirements.
I want to enhance my knowledge with big data analytics so that I am prepared for the era of data. R is a very useful language for analyzing data.
I watched this video to get some know-how on using R from prototype to production. It covers what an ideal machine learning system looks like after the prototype, introduces Metaflow, and shares some lessons learned.
Prototyping Machine Learning Models: Raw Inputs => Exploratory Data Analysis => Processed Data => Models => Communicate Results
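The prototyping pipeline above can be sketched in plain Python. This is only an illustration of the stages, not anything from the video; the function names (load_raw_inputs, explore, process, fit_model, communicate) and the toy data are my own assumptions.

```python
def load_raw_inputs():
    # Stand-in for reading raw data from files or a warehouse.
    return [1, 2, 3, 4, 5]

def explore(data):
    # Exploratory data analysis: summary statistics to understand the data.
    return {"n": len(data), "mean": sum(data) / len(data)}

def process(data):
    # Processed data: a simple max-normalization as a placeholder transform.
    m = max(data)
    return [x / m for x in data]

def fit_model(data):
    # Model: a trivial placeholder that "predicts" the average.
    return sum(data) / len(data)

def communicate(summary, prediction):
    # Communicate results: report what was found.
    return f"n={summary['n']}, mean={summary['mean']:.2f}, prediction={prediction:.2f}"

raw = load_raw_inputs()
summary = explore(raw)
processed = process(raw)
model_output = fit_model(processed)
print(communicate(summary, model_output))
```

Each stage feeds the next, which is exactly the linear shape that breaks down once the model has to live in production.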
So what comes after that? Configuration, data collection, feature extraction, data verification, machine resource management, analysis tools, process management tools, serving infrastructure, monitoring… But the machine learning code itself is fairly small compared to all of these surrounding components.
So what would an ideal machine learning system for R look like?
Moving from Prototype to Production in R
A basic Metaflow code example:
from metaflow import FlowSpec, step

class MyFlow(FlowSpec):

    @step
    def start(self):
        self.x = 0
        # Fan out into two parallel branches.
        self.next(self.a, self.b)

    @step
    def a(self):
        self.x += 2
        self.next(self.join)

    @step
    def b(self):
        self.x += 3
        self.next(self.join)

    @step
    def join(self, inputs):
        # Merge the branches, keeping the largest value of x.
        self.out = max(i.x for i in inputs)
        self.next(self.end)

    @step
    def end(self):
        pass

if __name__ == '__main__':
    MyFlow()
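To make the dataflow concrete, here is a plain-Python sketch (no Metaflow required, and not part of the video) of what the flow computes: each branch works on its own copy of the state produced by start, and join merges the branch results with max, so the final value is 3.

```python
import copy

class State:
    # Minimal stand-in for a flow's artifact state.
    pass

# start: initialize shared state.
start = State()
start.x = 0

# Fan out: each branch gets its own copy of the state.
branch_a = copy.deepcopy(start)
branch_a.x += 2   # step a

branch_b = copy.deepcopy(start)
branch_b.x += 3   # step b

# join: merge the branches, keeping the largest x.
out = max(i.x for i in (branch_a, branch_b))
print(out)  # 3
```

Running the real flow is done from the command line (e.g. saving it as a script and invoking it with Metaflow's run command), which then tracks every step and artifact for you.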