Wednesday, 1 April 2020

How to Build an eCommerce Business Model?


        eCommerce plays an important role in strengthening relationships and improving the efficiency of our dealings with suppliers and other key trading partners. Investopedia defines eCommerce as a business model, or segment of a larger business, that enables a firm or individual to conduct business over an electronic network. It operates in all four major market segments: B2B, B2C, C2B, and C2C (marketplace). From books to plane tickets, almost any product or service can be offered through eCommerce. The concept of supply chain management revolves around having the right product in the right place at the right time, and the key aspects of supply chain management include,
      * The ability of businesses to exchange information on stock levels
      * Fulfilling orders more quickly
      * Minimizing excess inventory
      * Using a network infrastructure to ensure good response time and speed
      * Improving customer service
        Having an eCommerce store is great, but what you do with it is what really counts. The first step is to see how an eCommerce store fits into your existing business. If you have a high street store, you are in a perfect position to take full advantage of eCommerce: an online store lets your customers order products online, which will increase your turnover and give you a much bigger potential audience. The eCommerce store needs to synchronize with your physical store so that the inventory is updated whether products are bought in-store or online; an electronic point of sale (EPOS) system can update stock automatically without any work on your part. This can be a small-scale business model, and you don't need to run the business on a massive scale from day one. You need to identify the type of product you will sell, or make your own products, and creating a store website needs only a small investment in inventory. You can use social media channels or an eBay store to increase sales, and use a blog for content marketing. If you are a blogger or marketer, you might already have an audience for your existing business, so you can promote to your email list and sell your products. The key is to identify the type of product that matches your niche and choose an eCommerce platform to suit your store. You can sell digital products like ebooks and software through the eCommerce store, and start using things like apps and plugins to sell your books in more inventive ways. You can also embed your eCommerce store into a Facebook page for more direct monetization of your social media efforts. For example, clothing resale is a simple and easy business model: you buy stock and resell it for a little more. You might buy 100 shirts for $500 ($5 each) and sell them for $10 each, which gives you $500 profit; selling at double the cost is called keystone pricing. You might then choose to keep $200 and reinvest $300, so that you can order $800 of stock next time for $1,600 of turnover.
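The keystone-pricing arithmetic above is easy to sanity-check in a few lines of Python (the figures are the hypothetical ones from the example, not real data):

```python
# Hypothetical numbers from the clothing-resale example above.
unit_cost = 5          # buy 100 shirts at $5 each
unit_price = 10        # keystone pricing: resell at double the cost
quantity = 100

cost = unit_cost * quantity        # $500 outlay
revenue = unit_price * quantity    # $1,000 turnover
profit = revenue - cost            # $500 profit

# Keep $200, reinvest the remaining $300 alongside the original $500.
kept = 200
reinvested = profit - kept
next_order = cost + reinvested     # $800 of stock
next_revenue = next_order * 2      # keystone pricing doubles it: $1,600

print(cost, revenue, profit, next_order, next_revenue)
```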

Building an eCommerce Store: The client has myriad choices when it comes to online shopping; if your website is confusing or unattractive, he will purchase somewhere else and may never return. The end objective of an eCommerce store is to sell more items, so you need to think carefully about the layout and design of your site. There are many considerations and design choices that will encourage sales. If you want to sell to the broadest range of people, you need to ensure your site looks official and trustworthy, with a professional-looking design and a brand with a high-quality logo. You can improve the UX of the website by following these rules,
 1. Use Easy Navigation - Navigation design matters when the user is trying to find something specific on your website, and finding the right category is the first step. Categories should reflect the user's mental model - where they expect to find the item - and should be accessible from the main navigation. Readability decreases if there are too many categories, so try to create a couple of main categories and make the rest subcategories. When categorizing products,
     * If a product fits in different categories, place it in each category, which makes items easier to find
     * The name of a category should reflect the user's thinking, not the business or technology thinking
     * Make sure the categories are intuitive and easy to navigate before going live

 2. Use the Search Bar - People use the search bar to find what they are looking for - often a specific product name. Users who utilize search are often in the late stage of buying mode: they have gathered all the information they deem necessary and made a decision on which product to buy. So, place a search bar in a visible place on the website, accessible from every subpage. Follow these rules when designing the search bar,
   * Users can make mistakes, so tolerate typos in search queries
   * Display similar or suggested products when nothing matches the query
   * If possible, display search results dynamically as the user types
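As a sketch of the typo-tolerance rule above, Python's standard-library difflib can fall back to fuzzy matching when nothing matches exactly (the product names here are invented):

```python
import difflib

# A tiny sketch of typo-tolerant search; the catalog is made up.
products = ["blue denim jacket", "red cotton shirt", "leather wallet"]

def search(query, catalog):
    """Return exact substring matches first, then close (typo) matches."""
    q = query.lower()
    hits = [p for p in catalog if q in p]
    if hits:
        return hits
    # Fall back to fuzzy matching so typos still return suggestions.
    return difflib.get_close_matches(q, catalog, n=3, cutoff=0.6)

print(search("wallet", products))        # exact substring match
print(search("leathr wallet", products)) # typo still finds the product
```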

 3. Include Filters in Searches - Filters are useful when you have hundreds of products in the same category. To design good filters, you need to understand how users search for your products; for example, a price range filter is important to many customers. When designing filters, it is important to consider their order and input method. Always select the input method that allows the fastest use, depending on the data type and the user's context. With dynamic sorting, the results change immediately after the user chooses a filter. With user-commanded sorting, the user has to confirm the choice before the results are shown.
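A minimal sketch of a dynamic price-range filter could look like the following (the catalog data is invented); re-running the filter on every slider change gives the "results change immediately" behavior:

```python
# Invented product data standing in for a real catalog.
products = [
    {"name": "basic tee", "price": 8},
    {"name": "hoodie", "price": 35},
    {"name": "denim jacket", "price": 60},
]

def filter_by_price(items, low, high):
    """Dynamic filtering: re-run this every time the user moves the slider."""
    return [p for p in items if low <= p["price"] <= high]

cheap = filter_by_price(products, 0, 40)
print([p["name"] for p in cheap])  # only items priced $0-$40 remain
```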

 4. Use Product Lists to Display Necessary Information - Product lists allow the user to decide which products they want to view. To support this process, you need to display the information that helps them find what they are looking for more quickly - for example, the price, color variations, size, model type, and so on. Another decision to make is whether to provide an option to add the product to the cart directly from the list. If the user can buy the item without going into more detail, add an option to quickly add the product to the basket, so the user doesn't need an additional step every time. If the product requires additional properties, like the color or size of a shirt, the user needs to go to the product page and choose those details.

 5. Design Easy-to-Scan Product Pages - Product pages should make the critical information visible at first sight, with everything else easily accessible without crowding. Some users will purchase the product based on an overview of the information, while others want to know everything about the product before buying, so you need to accommodate both types of users on your website. The elements that support good product pages are,
   * High-quality product photos from different angles
   * Visible product pricing
   * An eye-catching CTA button
   * An overview of the product
Also, you need to think about the color scheme and palette in your eCommerce website design. Different colors have different effects on your customers. For example, red and orange make people slightly impatient, and this can be used to your advantage: McDonald's uses this color scheme so that people don't feel comfortable spending a long time eating, which lets the store accommodate a higher turnover of customers and make more profit as a result.

 6. Implement an Easy Check-Out Process - Surveys show that people are less likely to buy from a store if they need to set up an account first. In the planning stage of an eCommerce project, it is also important to understand how the website will be maintained on a day-to-day basis. Getting users to the final step of the sale is critical: if you want to sell as much as possible, the process of buying from you should be streamlined, as simple as possible, and free of distractions. Breaking the checkout process into smaller steps reduces cognitive load and allows easier progress, which makes the process seem faster to complete. It is good to create a buy-with-one-click system. You can use PayPal for the checkout process, which makes shopping from your site more secure. For high basket values, it is important to display an order summary at every step, because your clients may feel anxious when they pay a lot. After the order is complete, send an email with an order summary and display it on the website. The order summary should include the ordered products, the amount paid, the package arrival date, and the delivery address. It is also good to allow the user to edit the basket contents: change the quantity, link back to the product, remove an item from the basket, and so on.
            To build the application and start selling online, you need third-party services. For example, to accept payments on the internet, you can sign up for Stripe and obtain its API keys. Stripe is the current standard for developers accepting payments in the US; its client-side (JavaScript library) and server-side (Node.js) documentation need to be implemented. Heroku is a cloud application platform that was acquired by Salesforce. It simplifies the task of deploying and maintaining cloud infrastructure. If you are using a Node.js server, you can deploy it to the internet using the Heroku CLI.

Sunday, 15 March 2020

What is Data Visualization and Machine Learning?


            A good analysis project starts with exploratory visualization that helps you develop a hypothesis, and it ends with carefully manicured figures that make the final results obvious. The actual number crunching is hidden in the middle and sometimes set aside. Machine learning finds the signal in the data, shows which directions are most promising for further work, and brings more clarity to the numbers. The Matplotlib library helps data scientists because it integrates well with other libraries; pandas provides a wrapper around Matplotlib and handles the image formatting. Data scientists were originally computer programmers or machine learning experts working on big data problems. They analyze datasets of HTML pages, image files, emails, and web server logs to write the production software that implements analytics ideas in real time. A data scientist extracts data from raw sources and spends most of the time getting the data into a form where statistical methods can be applied. Unicorns can construct a good statistical model, hack together quality software, and relate both in a meaningful way to business problems.
          AI is the broader term under which machine learning and deep learning fall. AI emphasizes the creation of intelligent machines that work and react like humans. The presentation part is the AI, and the actual implementation is machine learning or deep learning. Deep learning is the subset of machine learning concerned with algorithms that mimic the structure and function of the brain, called artificial neural networks.
          Machine learning is the subset of AI that gives machines the ability to learn without being explicitly programmed. The data and the learning algorithm are the keys: the machine trains itself according to the data provided to it, learning from the datasets. When you have a dataset, start by dividing it into two parts, train and test. Typically, 80% of the data is training data and 20% is test data used to test the model. So, train the machine with 80% of your data to create the machine learning model, then evaluate the model on the test data. If the accuracy is not good enough, repeat the process again and again until you get a final, tested model. The more data you provide, the more accurate the machine gets and the more easily it can identify objects. This is how the machine learns. We are surrounded by machine learning techniques. For example, machine learning powers product recommendations: when you shop on Amazon and browse a product, you notice a list of products similar to your interests. That is Amazon's recommendation engine, one application of machine learning. Whenever you call Alexa, it performs particular functions, like playing your favorite music or turning on the lights - another machine learning application. Machine learning applications also do traffic prediction, predicting the traffic on a particular route from point A to point B.
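The 80/20 split described above can be sketched in a few lines of plain Python (the "dataset" here is just toy numbers standing in for labeled examples):

```python
import random

# Toy data standing in for 100 labeled examples.
data = list(range(100))

random.seed(42)              # fixed seed so the split is repeatable
random.shuffle(data)         # shuffle before splitting to avoid order bias

split = int(len(data) * 0.8) # 80% train, 20% test
train, test = data[:split], data[split:]

print(len(train), len(test))
```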

Types of Machine Learning: There are 3 types of machine learning. Those are,
  * Supervised Learning
  * Unsupervised Learning
  * Reinforcement Learning
         In supervised learning, we have a labeled training dataset, and the machine trains on those labels. There is a teacher or supervisor who supervises the entire process and trains the machine according to the labels. A use case of supervised learning is the spam classifier: mail is classified as spam or not spam by filters. These filters are constantly updated based on new technologies, new spam identification techniques, and feedback given by Gmail users flagging potential spammers. A text filter eliminates mail based on the sender and their history, using algorithms to detect the phrases or words most often used in spamming. Another filtering method is the client filter, which blocks malicious or annoying spam senders. It looks at all the messages a certain user sends out; if the user constantly sends out a huge amount of email, or several of their messages are marked as spam by the text filter, that email address will be blocked. This feeds a user block list that prevents any inbound messages from the email address going forward.
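A toy version of the text filter described above might flag mail containing phrases commonly used in spam. The phrase list here is invented, and a real filter learns these signals from labeled training data rather than hard-coding them:

```python
# Invented list of phrases most often used in spamming.
SPAM_PHRASES = ["free money", "act now", "winner", "click here"]

def is_spam(message):
    """Flag a message if it contains any known spam phrase."""
    text = message.lower()
    return any(phrase in text for phrase in SPAM_PHRASES)

print(is_spam("You are a WINNER, click here to claim!"))  # True
print(is_spam("Meeting moved to 3pm tomorrow"))           # False
```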
      In unsupervised learning, there is no teacher and there are no labels. The machine identifies objects by forming clusters: it finds similar images and groups them together, so cluster one consists of similar items and cluster two consists of other similar items that are not related to cluster one. A use case often mentioned here is Amazon's voice-based personal assistant, Alexa. It recognizes the word "Alexa" and sends the recording over the internet to Amazon; this process is called the Alexa Voice Service, or AVS. For example, if you ask for the time, AVS sends back an audio file telling you the time, which the Echo plays back. The service, run by Amazon, converts the recording into a command that it interprets - essentially a voice-to-text service. Amazon also offers sample code for building an Echo using a Raspberry Pi. You can set up Philips or Solimo smart lights to be controlled with Alexa, for example to turn on the living room light. Amazon keeps adding more features and skills to Alexa, and if you are smart enough, you can build your own integrations to control things that are not on the list.
     In reinforcement learning, we have an agent and an environment. The agent selects actions using some policy. There is no teacher training the machine: if the machine makes the right decision, it gets a positive reward, and if it makes the wrong decision, it gets a negative reward. Again and again, the machine learns from this feedback - that is the process of reinforcement learning. For example, a self-driving car initially doesn't know which way to choose. It takes an action, and if the action performed was wrong, it gets a penalty of, say, 50 points. Next time, the car takes the past action into account, updates its policy, and iterates the process. Autonomous or self-driving cars can be safer than human-driven cars: they are unaffected by human fatigue, emotions, or illness, and they are always active and attentive, observing the environment and scanning multiple directions. It is difficult for another driver to make a move the car has not anticipated. Self-driving mainly depends on 3 technologies: IoT sensors, IoT connectivity, and the software that guides the car. There are many types of sensors in a self-driving car, such as sensors for blind-spot monitoring, forward collision warning, radar, and ultrasonic sensors; together these IoT sensors enable the car to navigate. Next, IoT connectivity uses cloud computing to act upon traffic data, maps, weather, and surface conditions, among others; this helps the cars monitor their surroundings better and make informed decisions. The car collects all the data and determines the best course of action. In today's world, Tesla cars analyze the environment using software known as Autopilot. It uses high-tech cameras to view and collect data, much as we do with our eyes; this is called computer vision or sophisticated machine cognition.
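The reward-and-penalty loop above can be sketched as a tiny value-learning agent. The two routes, the reward numbers, the exploration rate, and the learning rate are all invented for illustration:

```python
import random

random.seed(0)
values = {"route_a": 0.0, "route_b": 0.0}      # the agent's value estimates
true_reward = {"route_a": 50, "route_b": -50}  # route_b earns the penalty

for step in range(100):
    # Epsilon-greedy policy: mostly exploit the best estimate, sometimes explore.
    if random.random() < 0.1:
        action = random.choice(list(values))
    else:
        action = max(values, key=values.get)
    reward = true_reward[action]
    # Nudge the estimate toward the observed reward (learning rate 0.1).
    values[action] += 0.1 * (reward - values[action])

best = max(values, key=values.get)
print(best)  # after many iterations the agent prefers the rewarded route
```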

Basics of Data Visualization: Everyone uses summary statistics, like average sales per customer or the average height of a class. A statistic is a useful summary of a dataset, but it hides the detail within that data. For example, take Anscombe's quartet, a group of 4 datasets, each consisting of 11 pairs of x and y values. To visualize the data, you plot the x and y values from each dataset. According to the summary statistics, the 4 datasets look the same, but when you visualize them, they are completely different. Tables of numbers hide the information within the data; data visualization helps us explore, understand, and explain our data. Questions like these can be resolved with data visualization,
   * How is this data related to that data?
   * How is this data distributed?
   * How is this data made up, and how does this data look on a map?
      Mostly, it depends on the question: what type of answer do I want to find in my data? Comparisons are where you want to see one bit of data against another. For example, how do sales compare across regions?
Charts are great for comparisons. Bar charts are good for categorical data, and time series (line) charts are used when you have a date component. Relationships in data involve one or more measurements and examine how the dimensions affect that relationship - for example, comparing height and weight and seeing how that relationship changes across countries. If you are looking for patterns and outliers in data, you can use a scatter chart, where the position of each data point reflects its measured values on the dimensions in the view. Summary statistics look at the distribution of data, but sometimes you need to see the shape the data has: is it clustered around the median? Is it bi-modal? Does it skew towards higher or lower values? A histogram is a great way to look at a summary of a distribution. Alternative ways of seeing the makeup of data are pie charts, area charts, and treemaps. The good starting point for visualization is to ask yourself: what kind of questions do I want to ask? Then you can make the right choice every time depending on the data.
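Anscombe's point - that summary statistics can hide the shape of the data - is easy to demonstrate with Python's statistics module on two invented datasets that share a mean but differ wildly in spread:

```python
import statistics

# Two invented datasets with the same mean but very different shapes.
clustered = [9, 10, 10, 10, 11]   # tightly clustered around 10
spread = [2, 6, 10, 14, 18]       # widely spread around 10

print(statistics.mean(clustered), statistics.mean(spread))    # both 10
print(statistics.stdev(clustered), statistics.stdev(spread))  # very different
```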

Saturday, 29 February 2020

How to set up a Python data science project?


           In data science, Python 3 is a powerful computational tool for working with data. It is used in small and large organizations to handle all kinds of tasks, from data analysis integrated with web applications to statistical code incorporated into a production database. Python emphasizes productivity and code readability. To write programs you need an environment for coding. Jupyter Notebook is a computational platform that allows you to code and apply your data science skills; you can install it from the command line with "pip install jupyterlab" and launch it by typing "jupyter notebook", which will open your notebook. The R language is used for data analysis tasks requiring standalone computing or analysis on individual servers; it focuses on user-friendly data analysis, statistics, and graphical models. Datasets are arrangements of data and can be structured in different ways, with data files stored in specific formats. The most common file format for storing data is comma-separated values, or CSV, where each record is stored as a line and fields are separated by commas. The pandas library in Python has a read_csv method that quickly reads such a file into memory. The JSON format is the sort of data that gets exchanged in data science web applications.
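Reading CSV data can be sketched with the standard library alone (the records below are invented); in practice, pandas' read_csv does the same job in one call and returns a DataFrame:

```python
import csv
import io

# Invented CSV data: each record is a line, fields separated by commas.
raw = "name,age\nAda,36\nGrace,45\n"

# csv.DictReader maps each row to a dict keyed by the header line.
rows = list(csv.DictReader(io.StringIO(raw)))
print(rows[0]["name"], rows[1]["age"])
```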

Big Data and Data Science: Big data isn't just bigger, or lots of, data. It is a qualitatively different approach to solving problems, one that requires qualitatively different methods - that is both the challenge and the promise of big data. Big data differs from small data in the following ways,
1. Goals: For small data, there is a specific, singular goal - you are trying to accomplish one task by analyzing the data - whereas big data goals evolve and get redirected over time. You may have one goal at the starting point, but things can take unexpected directions.
2. Location: Small data is usually in one place - one computer file, or one floppy disk. But big data can be spread across multiple servers in multiple locations anywhere on the internet.
3. Data Structure and Content: Small data is typically structured in a single table, like a spreadsheet. But big data can be semi-structured or unstructured, across different sources.
4. Data Preparation: Small data is usually prepared by the end-user for their own goals, so they know who is putting it in, what it is meant to accomplish, and why it is there. But big data is a team sport, prepared by many people who are not the end-users, and the degree of coordination required is extraordinarily advanced.
5. Longevity: Small data is kept only for a limited time after the project is finished; it doesn't matter if it goes away after a few months or years. Big data is stored perpetually and becomes part of later projects, so a future project might add to the existing data, historical data, and data from other sources as they come in. It evolves over time.
6. Measurements: Small data is typically measured in standardized units using one protocol, because one person is doing it and it all happens at one point in time. But big data comes in many different formats, measured in many different units, gathered with different protocols in different places at different times. There is no assumption of standardization or uniformity.
7. Reproducibility: Small data projects can be reproduced; if the data goes bad or missing, you can do it over again. With big data, replication may not be possible or feasible. Bad data is identified by a forensic process, and you attempt to repair things or do without it.
8. Stakes Involved: In small data, the risks are generally limited; if the project doesn't work, it usually isn't catastrophic. In big data, the risks are enormous because so much time and effort are invested in it. A project can cost hundreds of millions of dollars, and lost or bad data can doom it.
9. Introspection: This has to do with where the data comes from and identifying the data. Small data is well organized: individual data points are easy to locate, and clear metadata says where they come from and what the values mean. Big data may involve many different files, potentially in many different formats, so it is difficult to locate the data points you are looking for; if it is not documented well, things will slip through the cracks, and it becomes difficult to interpret exactly what each value means.
10. Analysis: With small data, you generally analyze all the data in one procedure on one machine. Big data may be broken apart and analyzed in several different steps using different methods, with the results combined at the end.
          Data scientists follow certain processes and stages, called the data science life cycle. The first stage is to formulate a question about the problem you want to solve. The most important part of a data science project is the question itself and the creativity that comes from exploring it. Then, acquire the data relevant to the problem or question. When you collect a sample of data, you need to make sure there is as little bias as possible in how the sample is collected. There are various methods to collect samples,
         1. Simple Random Sampling(SRS)
         2. Cluster Sampling
         3. Stratified Sampling
             One of the methods of probability sampling is called simple random sampling: collecting a sample at random without replacement. For example, to take a simple random sample of size two from a population of 6 people, you write A to F on slips and place the slips in a lot; when you take 2 slips from the lot without looking, you get any of these samples with equal chance: AB, AC, AD, AE, AF, BC, BD, BE, BF, CD, CE, CF, DE, DF, EF. Probability sampling is any sampling method that assigns a precise probability to the appearance of each sample. Another method of probability sampling is cluster sampling, done by dividing the data into clusters and using simple random sampling to select clusters. In the previous example, you can form 3 clusters of 2 people each - AB, CD, EF - selected with equal chance. Cluster sampling is an easier collection method and is often used to conduct surveys, but the disadvantage is that there is more variation in the estimate, so you need to take larger samples. Stratified sampling divides the data into strata and produces one simple random sample per stratum. In the previous example, you divide the population into 2 strata - stratum one as A, B, C and stratum two as D, E, F - then use SRS to select one person from each stratum, giving samples like AD, AE, AF, BD, BE, BF, CD, CE, CF. In stratified sampling, the strata do not need to be the same size.
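The three sampling schemes above can be sketched with the standard library; the A-F population matches the example in the text:

```python
import random
from itertools import combinations

population = list("ABCDEF")

# Simple random sampling: each of the C(6,2) = 15 pairs is equally likely.
possible = ["".join(pair) for pair in combinations(population, 2)]

random.seed(1)
srs = random.sample(population, 2)           # one simple random sample

# Cluster sampling: pick one of the pre-formed clusters at random.
clusters = [["A", "B"], ["C", "D"], ["E", "F"]]
cluster_sample = random.choice(clusters)

# Stratified sampling: one SRS of size 1 from each stratum.
strata = [["A", "B", "C"], ["D", "E", "F"]]
stratified = [random.choice(stratum) for stratum in strata]

print(len(possible), srs, cluster_sample, stratified)
```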
       In the third stage, you conduct exploratory data analysis (EDA) to understand the data you have. Here, you visualize the data to find patterns, issues, and anything notable in it. Data visualization is an essential tool in data science - the rule of thumb for data science deliverables is that if there isn't a picture, you're doing it wrong. Machine learning practitioners need to know where the signal in the data is hidden and which directions are most promising to pursue, and visualizations convey the trends and anomalies of data efficiently. The 2 important visualization tools in Python are Matplotlib and Seaborn. They allow you to create two-dimensional and multi-dimensional plots of your data and help you visualize both qualitative and quantitative data.
When you perform EDA, make sure to apply the following,
        * Avoid making assumptions about the data
        * Examine the statistical data types in the data
        * Examine the key properties of the data
       This will help you find the answer to your question or the problem you want to solve. Finally, you use prediction and inference to draw conclusions from the data. Data scientists use inference to quantify how certain they are about the trends they see in their data, and to draw conclusions
from the dataset. When you see trends in the data, you need to determine whether they occur due to random fluctuation in data collection or reflect a real phenomenon; hypothesis testing will help you solve this problem.
      Classification is a machine learning technique for making categorical predictions about data. Once you have data with the correct categories, the machine learns from it to make predictions in the future. For example, weather stations forecast tomorrow's weather from today's and previous days' weather; classification is also used to predict whether a patient has a particular disease. The situation in which you make a prediction is called an observation. Each observation has certain aspects, called attributes, and the specific category an observation belongs to is called its class. The goal of classification is to correctly predict the class of an observation using its attributes. You go through this process repeatedly for more questions and problems. These are the major stages of the data science process.
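The attributes-to-class idea above can be sketched as a toy 1-nearest-neighbour classifier; the weather observations and labels here are invented, not a real forecasting method:

```python
# Invented training data: (temperature, humidity) attributes -> class label.
train = [
    ((30, 80), "rain"),
    ((32, 75), "rain"),
    ((22, 30), "clear"),
    ((24, 35), "clear"),
]

def classify(observation):
    """Predict the class of the closest training observation."""
    def dist(attrs):
        # Squared Euclidean distance between attribute vectors.
        return sum((a - b) ** 2 for a, b in zip(attrs, observation))
    return min(train, key=lambda pair: dist(pair[0]))[1]

print(classify((31, 78)))  # close to the "rain" observations
print(classify((23, 33)))  # close to the "clear" observations
```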

Python3 and Raspberry Pi: The Raspberry Pi is a single-board computer (SBC): all the parts sit on a single printed circuit board (PCB) the size of a credit card, and the individual components are not replaceable or upgradeable. Normal desktop/laptop computers are not well suited to connecting to sensors. The Raspberry Pi has up to 40 GPIO (general-purpose input/output) pins, depending on the model, and these pins are used to connect sensors and other devices. Using the GPIO pins, you can program hardware devices directly using high-level programming languages like C, C++, shell scripting, or Python, and address the hardware in a way you cannot easily do on a PC. You can download the Raspberry Pi operating system: go with Raspbian and install the desktop environment software, the minimal image based on Debian Buster. A download accelerator helps here, since the image is large and the download may otherwise fail. Then follow the Raspbian OS setup guide and find the IP address of your Raspberry Pi from the command line. You can access the graphical user interface and desktop of the Raspberry Pi by enabling VNC (Virtual Network Computing): sign up for the RealVNC server and download the viewer for the OS installed on your computer. Once you provide the username and password (pi, raspberry), it opens the remote desktop, and you can start programming remotely. It is a very useful tool when you want to run small Python programs for training purposes, data visualization, data science, and pursuing your own goals.

Saturday, 15 February 2020

What are the fundamentals of Cloud Computing and Data Science?


         Data science refers to a collection of related disciplines focused on using data to create new information and technology, providing useful insights for better decisions. For example, big data overcomes the challenge of analyzing the huge volume of modern data generated at high speed. In the real world, computing devices such as cellphones and security cameras are constantly generating data, and the set of such devices connected to the internet, also known as IoT, is ever-growing. Computers can make decisions based on trusted algorithms to make accurate predictions. Data analytics is a more enhanced way of taking advantage of exponentially increasing computing power and storage capacity. You need a basic knowledge of statistics to be a successful data scientist. The data industry is driven by languages like Python and R, which come with powerful libraries that implement statistical functions and visualization features, so the programmer or data scientist can automate the necessary tasks and focus on solving larger problems. Distributed file systems like Hadoop's and distributed processing engines like Spark play a critical role in big data and enable you to make informed decisions. Machine learning helps detect data patterns and make better predictions about a dataset; in fraud detection, for example, machine learning dramatically reduces the workload by filtering a significant number of data points and presenting only the suspicious candidates. Visualization tools can greatly enhance the presentation. Data scientists need to specialize their core job duties in a particular area.
          Data science requires support from cloud computing and virtualization because of the ever-increasing size, speed, and accuracy requirements of the datasets we have to manage. Cloud computing provides the scalability required for computing resources - in effect, the cloud provides the processing power and storage space. Software applications connect virtual machines through a high-speed network and implement distributed file and processing systems. Hadoop and Spark are key elements built on those virtual machines, solving data science problems by connecting to the specific data science application. Cloud computing, virtualization, machine learning, and distributed computing are the technologies data scientists need to do their jobs effectively. Proxmox is easy to install for cloud computing and virtualization, letting you build your own cloud and configure the software. Weka is a machine learning tool that allows users to run various machine learning algorithms in a GUI environment.

Fundamentals of Cloud Computing: If you want to familiarize yourself with Microsoft Azure, you first need to familiarize yourself with cloud computing as a whole. There are 3 deployment models of cloud computing. Those are,
  1. Public Cloud
  2. Private Cloud
  3. Hybrid Cloud
       With on-premises infrastructure, the servers, hardware, services, and firewalls deployed in your company are managed by an internal administrator who is responsible for the functions and features available to users. Users consume the services, while you update, upgrade, and manage the hardware those services live on. In a private cloud, the user can be an administrator with a portal-based environment from which they manage the environment, provision servers, and deploy applications, websites, and so on. Which functionalities the portal exposes depends on the software that manages the private cloud; for example, Microsoft's System Center 2012 R2 provides a private cloud infrastructure. A private cloud is typically a private data center, so you remain responsible for hardware, software, and network services. In a public cloud like Microsoft Azure, Google Cloud, or AWS, the vendor is responsible for most of those tasks. Public clouds use a leasing model, basically pay-as-you-go: you pay for the infrastructure you consume across workloads, applications, and services, whether that is data stored in the cloud or services offered by virtual machines. The advantage of the public cloud is that you can deploy a new application or server at very low cost without buying new hardware to support the additional infrastructure, which ultimately reduces the company's capital expenditure. The hybrid cloud is a mix of the public and private solutions: you keep your own internal private data center for some workloads, services, and applications, and move others into the public cloud. It is more complex to manage because you have to run both environments in coexistence.

Cloud Computing Services: The cloud is a collection of remote servers connected via computer networks and available through the internet. Virtualization is what implements cloud computing: a hypervisor runs on the hardware, and many operating systems such as Windows and Linux can be installed on top of it. You can fire up virtual machines and leverage the vast resources of a cloud provider as business operations grow, which gives you the flexibility to expand your infrastructure quickly when necessary. Cloud computing companies specialize in managing server farms and know how to maximize profit and minimize expenses. There are 3 major service models of cloud computing. Those are,
   1. Infrastructure as a Service(IaaS) - The raw computing infrastructure is offered as a service. For example, if you want a lab of 10 PCs, you can launch 10 instances on AWS EC2 and put them in the same network. Now this is your computer lab.
   2. Platform as a Service(PaaS) - It is a platform to run your code. You tell the cloud which compiler or interpreter you want and run your program on it; a cloud IDE can be used to write and run the code.
   3. Software as a Service(SaaS) - It is largely self-explanatory. For example, Google Docs and Dropbox offer free storage and ready-to-use software delivered as a cloud service.

Application Migration to Cloud: A successful migration of a large application portfolio requires two things. Those are,

    * Think and Plan strategically and,
    * Rapidly iterate through feedback loops to fix the things that are going wrong

      But there are many things to consider when migrating to the cloud, including application architecture, the ability to scale out, the distributed nature of the platform, and so on. When we migrate applications to the cloud, we get a new architecture with different properties and characteristics than traditional systems. One advantage of cloud migration is the ability to run an active-active architecture: two copies of the application run in real time at the same time, and one takes over if the other fails. Here we automate as much as possible, so that being in the cloud is worth it to the business. Application migration matters because:
   * you are selling the value to the stakeholders who are funding the cloud migration,
   * it makes the business more agile and delivers value,
   * it forces you to understand your applications broadly, looking at both the specific needs of each application and the general consensus, and
   * it lets you modernize as you move: updating database models and technologies, improving security and governance, and leveraging systems for whatever purpose you need.

Whatever the exact sequence of cloud migration steps, there is ultimately a bit of trial and error, so set up operational processes and pursue continuous improvement.

Data Migration to Cloud: Data is the highest priority when migrating to the cloud. Data basically is the business, and it is everywhere in the enterprise; data is the killer application of cloud computing. Migrating to the cloud lets us find new value and innovate by running databases, big data systems, predictive analytics, and AI-based systems there. Data selection is therefore a critical process: understanding which database is bound to which applications, what they are doing, and their security, compliance, and performance issues is what leads to success. The business case, migration, testing, and deployment all rest on an understanding of the data; you need to look at the applications that depend on the data before you deploy. Ultimately, the goal of leveraging data is to lower operational costs, integrate existing data silos so that different databases communicate with one another as a single dataset, and influence actions and outcomes, not just store data, so the business has the information it needs to run better. We are not going to move every piece of data that exists on premises into the cloud; we may move 70% of it, and we then have to deal with integration between on-premises data stores and those in the cloud. So make sure to build a solid architectural foundation for success when considering data, and avoid duplicate data and data silos. In a real-world scenario, you need to consider the following things when you migrate to the cloud:
 * It is necessary to understand the total cost of ownership (TCO) for the first year, second year, fifth year, and so on. This includes the TCO for applications, databases, and cloud instances, as well as the ROI. The top 5 TCO/ROI considerations are,
   - Value of Agility
   - Cost to retire selected applications, infrastructure or data centers
   - Changes required to maintain a service level
   - Software costs
   - Organizational transformation costs
 * Ensure that a solid business case exists before the migration begins, including how the technology is going to be applied.
 * Determine the value metrics or value points, such as agility, compressed time to market, and cost savings, against which the total cost will be measured.
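The TCO/ROI comparison above can be sketched with a few lines of arithmetic. All figures below are made-up illustrations, not benchmarks; the point is only the shape of the calculation (upfront plus recurring costs over a planning horizon).

```python
# Hypothetical multi-year TCO/ROI comparison for a cloud migration.

def total_cost(upfront, yearly, years):
    """Total cost of ownership: one-time cost plus recurring yearly cost."""
    return upfront + yearly * years

# On-premises: big capital expenditure, moderate yearly operations.
on_prem = total_cost(upfront=200_000, yearly=50_000, years=5)

# Cloud: little upfront cost, higher pay-as-you-go yearly spend.
cloud = total_cost(upfront=20_000, yearly=70_000, years=5)

savings = on_prem - cloud
roi = savings / cloud  # savings relative to what the cloud path cost

print(f"5-year on-prem TCO: ${on_prem:,}")            # $450,000
print(f"5-year cloud TCO:   ${cloud:,}")              # $370,000
print(f"Savings: ${savings:,}  ROI: {roi:.0%}")       # $80,000, 22%
```

In practice you would add the other line items from the list above (retirement costs, service-level changes, software, organizational transformation) as further terms in the same calculation.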

Friday, 31 January 2020

What is cloud computing and CRM for client app?


       Cloud computing is a remotely hosted platform made up of infrastructure, platform, and application layers. The infrastructure includes compute, storage facilities, and the network, which could be centrally located or spread out internationally. On top of that hardware sits a layer called the platform, which covers object storage, identity management, runtime environments for programs, databases, and so on; think of the various hosting platforms where you get basic storage, databases, and runtime engines. Various applications run on the software side. The platforms and infrastructure are managed centrally by providers like AWS and Microsoft Azure. In practice, you log in to the application on remote servers from a desktop, laptop, or tablet and do your work; everything is installed in the cloud rather than locally. This is called cloud computing.

The sales process in CRM can be defined as,

                        Generate   ------------>  Qualify  ----------------> Close

Qualifying Leads: For a long time, B2B businesses (not just SaaS businesses) have been qualifying leads. By qualifying leads, a business makes the sales process more efficient: the sales team highlights the leads that have the highest likelihood of converting into customers. Qualification is determined by two factors:
  1.  Firmographics and,
  2.  Interest
        Firmographics measure the business characteristics of a lead: How big is the company they're with? What is their job title? What industry are they in? Qualifying on firmographics hones your sales efforts on your target customer profile. Interest measures the lead's interest in buying your product: the more they engage, the more interested they are and the more likely they are to buy. There are 3 types of leads to focus on. Those are,
    1. Marketing Qualified Leads(MQL)
    2. Sales Qualified Leads(SQL)
    3. Product Qualified Leads(PQL)
     MQLs factor in firmographics and interest. They have shown a little interest in a marketing initiative (e.g., signed up via an opt-in form) and meet some kind of firmographic criteria. These leads deserve attention but are not quite ready for a personal touch from a salesperson. An SQL has deep interest and has taken actions that justify attention from a salesperson; typically, an SQL has visited your website several times, downloaded a whitepaper, or requested a demo. In the modern world, free trials and freemium have changed what deep interest looks like in a qualified lead, because people aren't going to pay for a product until they've seen value in it. A PQL shows true, deep interest by:
   * Using the Product
   * Hitting First value
    Traditional qualification criteria include activities like downloading an ebook or visiting the pricing page, which imply an interest in your offering. A user who sets up a trial is giving your product a go; from there, some will explore the features and methodology on their own terms. The ones who continue the setup process end up hitting first value, and this is how they distinguish themselves as truly interested: it means the lead has found value from using the product and is more likely to keep wanting to use it. PQLs are more likely to become successful long-term customers.

Business Setup for PQL: To build a proper PQL process, you need to define the criteria that will make someone product qualified and have a set of guidelines for turning these leads into paying customers. There are 6 steps for setting up PQLs. Those are,
1. Understand Activation (rate) and Measurement (score), the two key PQL measurements
2. Set up a system for keeping tabs on product data
3. Define activation criteria for the product
4. Create an engagement scoring engine
5. Rank activated trials by engagement
6. Make sure the sales team has access to the data

    There are a few steps a new user needs to complete to get set up and get value from your product, and they are typically product-specific. For example, in a SaaS company offering a GSuite plugin for email collaboration, the activation rate is the percentage of onboarding steps completed: if an account or user has done 2 out of 5 steps, they are 40% activated. Once the account is activated, you can dig deeper into its PQL status by looking at its engagement score, which is based on the events or actions a user can take in your product. There are several things one might do in a SaaS product, like creating a report or checking the payment system. When a user sets up a trial to give your product a go, figure out the 3 or 4 actions that let a new account or user experience "first value"; these actions become the activation checklist. Because you are tracking your product data, you will be able to track the activation progress of your accounts. The engagement scoring engine gives each of your users and accounts an engagement score based on how much they use your product and which features they use. Once you have created the engagement scoring model for your product, you can compare accounts and rank them.
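The two PQL measurements above can be sketched in a few lines. The step names, event weights, and accounts below are hypothetical, chosen only to reproduce the "2 of 5 steps = 40% activated" example from the text.

```python
# Hypothetical activation checklist and engagement weights for a SaaS trial.
ACTIVATION_STEPS = ["install_plugin", "connect_inbox", "invite_teammate",
                    "create_report", "send_first_email"]

EVENT_WEIGHTS = {"create_report": 5, "check_payments": 3, "login": 1}

def activation_rate(completed_steps):
    """Percentage of onboarding steps done: 2 of 5 -> 40.0, as in the text."""
    done = [s for s in completed_steps if s in ACTIVATION_STEPS]
    return 100 * len(done) / len(ACTIVATION_STEPS)

def engagement_score(events):
    """Weight each product event; heavier events signal deeper usage."""
    return sum(EVENT_WEIGHTS.get(e, 0) * count for e, count in events.items())

accounts = {
    "acme":   {"steps": ["install_plugin", "connect_inbox"],
               "events": {"login": 4, "create_report": 2}},
    "globex": {"steps": ACTIVATION_STEPS,
               "events": {"login": 10, "create_report": 6, "check_payments": 3}},
}

# Rank trials by engagement so sales sees the hottest accounts first (step 5).
ranked = sorted(accounts,
                key=lambda a: engagement_score(accounts[a]["events"]),
                reverse=True)

print(activation_rate(accounts["acme"]["steps"]))  # 40.0
print(ranked)                                      # ['globex', 'acme']
```

A real implementation would feed these functions from your product-analytics events rather than hard-coded dictionaries, but the ranking logic is the same.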
         When a customer asks how they should design their PQL process, the first question to ask is: How complex is your product? More simply, how hard is it for a new user to self-serve their way to first value without manual support or intervention from your team? Simple products can offer a free trial or a freemium model. Intermediate products are more complex than simple ones, but a fair percentage of users can still self-serve their way to value. Complex products require manual support to get to the promised land: technical implementation, access to data or tools from other departments, or deeper domain knowledge before a user gets value. A complex product may still offer a free trial period, but it can typically only be sold to customers who go through a sales demo with a more hands-on approach.

CRM Basics:  There are 4 basic features in CRM. Those are,
         1. Marketing,
         2. Sales Force,
         3. Customer Support and
         4. Service Automation for internal service
     In business, you need to generate leads and prospects and turn them into customers; that is how the business grows. Leads can be generated online and offline through websites, cold calls, emails, social media, and so on. How does the business nurture a lead? Once you have a lead, marketing automation sends promotional material, and you can program your emails into automated campaigns. First, the qualified lead has to be converted, the transition into the formal sales process: the qualified lead needs to be assigned and followed up, its interest and intent gauged, and opportunities identified. A converted lead is considered a sales prospect. Sales is the strategy for exchanging goods, and at any given time multiple prospects will be in different stages, so you need to define your sales process: stages, progress, and the probability of success. The sales funnel, deals, and sales tracking are managed in sales force automation, and the prospects and stages together define the sales pipeline for the business. In contact-center customer support, you can assign cases to representatives, distribute cases among various people to give support, and decide when services will happen. In service automation, you can create projects and calendars and automate messages, reminders, and alerts. So design something that works for your business; the sales cycle must be refined over time.
     One of the main utilities in CRM is analysis. You can create reports from various sources: leads to be converted, budgets, sales, the number of support calls, calls answered, and cases solved. It is important to analyze all this data and construct business decisions from it. The collaboration utility helps you work with the marketing, sales, HR, and accounting teams; you can send emails together or to each other while working on a project, and with remote login it is possible to collaborate from any office in any part of the world. Next, relational intelligence is about understanding and analyzing each customer: the number of deals you do with them, how much they spend on your products, their product preferences, the material they supply, and so on. From this data you can understand your relationship with each customer or vendor and predict the future from past history. Lastly, customers come through various channels, online, offline, telephone, and so on; all these input channels can be collected into a central database where entries are categorized and assigned individually. This is called multiple channel integration. For example, the cloud-based SaaS Agile CRM automates sales, marketing, and service in one platform.
      The CRM works like this: leads come in from verbal communication, telephone marketing, and emails and are sent through data classification. First, they go through composition and insertion into the organizational database, which classifies them into various categories. Once the data is in the organizational database, it is analyzed and disseminated to the appropriate departments: support queries go to support, leads go to sales, campaign-related queries go to marketing, and management-related queries go to management.
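The routing described above can be sketched as a small classifier. The departments match the text; the keyword lists and sample messages are illustrative assumptions (a real CRM would classify on much richer signals than keywords).

```python
# Minimal sketch of multi-channel routing: classify each incoming
# message and dispatch it to the right department.
ROUTES = {
    "support":   ["error", "broken", "help", "issue"],
    "sales":     ["pricing", "buy", "demo", "quote"],
    "marketing": ["campaign", "newsletter", "promotion"],
}

def route(message):
    """Return the department a message should go to; default to management."""
    text = message.lower()
    for department, keywords in ROUTES.items():
        if any(word in text for word in keywords):
            return department
    return "management"

inbox = [
    "My plugin shows an error after the update",
    "Can I get a quote for 50 seats?",
    "Question about the spring promotion email",
    "Board agenda for Q3",
]
print([route(m) for m in inbox])
# ['support', 'sales', 'marketing', 'management']
```

Each channel (web form, email parser, call transcription) would feed into the same `route` step before insertion into the organizational database.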

Tuesday, 14 January 2020

What are the Important characteristic of Artificial Intelligence(AI)?

      AI is the development of methods and algorithms that allow computers to behave in an intelligent way. AI was introduced to the scientific community in 1950 by Alan Turing in "Computing Machinery and Intelligence". Turing argued in that article that machines can think, making a case for machine intelligence that marked the start of the interaction between AI and psychology. He focused on analyzing problems in mentalistic terms, intending to eliminate the distinction between natural and artificial intelligence, and he contributed to the design of the first programs capable of playing chess and to establishing the symbolic nature of computing. Later, McCarthy and Raphael worked on the mobile robot "Shakey", which had to cope with the real world in terms of space, movement, and time; it initiated the study of cognitive processes, with discussion centered on the problems of mental and internal representation of knowledge, perception, and meaning. Raphael's basic idea was to build a machine with the capacity to learn from experience, to model and recognize visual patterns, and to manipulate symbols. This raised the question of whether the performance of intellectual work by machines counts as truly intelligent, and Maurice Wilkes argued for the need to develop a generalized learning program that would enable a computer to learn in any field.
          AI can automate mechanical processes like calculation, data storage, and processing. There are 3 basic steps in the fundamental process behind making decisions in problem-solving:
     1. Analysis of the Situation
     2. Logical Reasoning and
     3. Decision
Applications have been developed to simulate human behavior through computer systems: human senses are interrelated with computer systems to build applications for the problems humans solve every day. An AI program helps with goals like learning, perception, and problem-solving, and it is used in specific areas like diagnosing diseases and driving cars. There are 4 types of AI systems. Those are
    * Systems that act like Humans - For ex., simulating human behavior in a given environment
    * Systems that think like Humans - For ex., machines that model human thinking
    * Systems that act Rationally - For ex., intelligent behavior created with a computational process
    * Systems that think Rationally - For ex., focusing on mental faculties that can be emulated in computer models
Basically, we are separating along two axes: human-like versus rational, and action-oriented versus cognition-oriented. And, there are 3 basic domains in AI. Those are,
    * Formal Domains - intended to solve problems using different models, like search models and algorithms
    * Technical Domains - applying scientific-technical knowledge, as in medical diagnosis, robotics, and expert systems
    * Cognitive Domains - trying to understand the functioning of the human mind and its cognitive functions, like reasoning, hearing, and talking

Characteristics of AI: In an AI program, the behavior is not explicitly described by an algorithm; the sequence of steps is influenced by the particular problem at hand, and the program finds its own way to the solution. This is called a declarative program. On the other hand, a program that is not AI follows an algorithm that explicitly defines the steps for any given input; this is called a procedural program. Programs that incorporate facts and relationships of the real world in the domain in which they operate use knowledge-based reasoning. By accumulating knowledge, AI becomes more adaptive toward solving new types of problems, and it can work even with poorly structured problems and data.
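The procedural/declarative contrast above can be made concrete with a toy problem. Both functions below are sketches: the first encodes the exact steps (the quadratic formula), while the second only states the goal (find x where f(x) = 0) and lets a generic search find integer answers.

```python
import math

def roots_procedural(a, b, c):
    """Procedural: the quadratic formula, spelled out step by step."""
    d = math.sqrt(b * b - 4 * a * c)          # discriminant (real roots assumed)
    return sorted(((-b - d) / (2 * a), (-b + d) / (2 * a)))

def roots_declarative(a, b, c, lo=-100, hi=100):
    """Declarative-style: state the goal f(x) == 0 and search candidates."""
    f = lambda x: a * x * x + b * x + c
    return sorted(x for x in range(lo, hi + 1) if f(x) == 0)

# x^2 - 5x + 6 = 0  ->  roots 2 and 3
print(roots_procedural(1, -5, 6))   # [2.0, 3.0]
print(roots_declarative(1, -5, 6))  # [2, 3]
```

The search version is far less efficient, but it illustrates the point: nothing in it is specific to quadratics, so the same generic method would find integer roots of any function you hand it.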
     Scope: The machine solves problems that were not defined before. AI does not just solve one specific problem, such as a particular second-degree equation; it is a method for creating systems capable of finding methods to solve problems. It works at a higher level of understanding and solves a broader variety of problems.
    Perception: Machines are able to react to the environment and influence it through sensors and interaction devices. Perceptual capabilities such as image synthesis and spoken-language communication let computers interact in ways that were not possible before.
  Communication: We traditionally communicate with computers through assembly languages or high-level special-purpose languages; AI aims at understanding natural language so that humans can state problems in the language we speak. Traditional languages were not well adapted to AI applications; today AI applications commonly use languages like Python and Java.
  Expert Systems: These consist of large knowledge bases created to store the information available to human experts in various fields, plus a series of manipulation rules expressed in a specific language. For example, computer design, medical diagnosis, and chemical engineering have produced highly successful expert systems.
   AI Hardware: AI techniques demand fast access to memory banks that are huge compared to those of other types of programs, as well as fast manipulation of data. The more advanced the hardware, the easier it is to work on AI programs.
   Robotics: The science of robotics involves many different AI techniques. The idea of a robot with the ability to learn from experience is a central theme of AI research. Such a robot would communicate in natural language and perform tasks equivalent to those of humans.

Applications of AI: AI is a combination of techniques and algorithms whose purpose is to create machines with the same capabilities a human being has: logical reasoning, representation of knowledge, planning, general intelligence, natural language processing, perception, and many others. For example, Siri from Apple, the social network Pinterest, and Google Photos all use AI. Here are some practical examples of AI:
   * In agriculture, AI simplifies and accelerates decision-making, for instance choosing the best time to plant and harvest. Thousands of platforms run market analysis on information such as soil, seeds, and climate change; by analyzing all this information, we can predict what will happen and pursue the best results.
   * In logistics and transportation, self-driving cars now operate on real roads in the modern world.
   * In health and biotechnology, AI helps physicians and patients get a faster and more accurate diagnosis. One of the most important aspects of decision-making is detecting different diseases, taking information from blood samples and other material to be analysed. AI provides key insights into the data provided by patients; by analyzing that data, we can find better and faster solutions to existing problems.
   * In marketing, AI makes sales forecasts and chooses the right product to recommend, at the right cost, to a particular customer. An excellent application in the retail sector is inventory optimization, where AI forecasts income and determines how much stock should be purchased.
  * In education, AI suggests new courses, personalizes courses for optimized learning, and promotes education; it also helps us build online education systems.
  * In financial services, institutions use AI to recognize the risk of a customer, predict market patterns and their consequences, and recommend operations.
  * In manufacturing and the supply chain, AI studies which products and parts require maintenance, helps manufacturing companies decide when to buy or produce, and predicts impacts and risks for suppliers.
  * In banks, personal assistants help perform digital operations and answer questions, which streamlines attention to the public.
        There are some great websites that provide AI services to automate the workforce with AI tools. One AI scheduling tool solves the hassle of scheduling meetings and appointments: its assistants, Amy and Andrew, understand natural language and help you schedule and negotiate meetings with your clients. Octane AI drives sales through Facebook Messenger and increases your revenue in a way your customers will love; this chatbot follows up on abandoned carts and messages, pitches the product, and can automatically ask questions that help turn customer behavior into revenue. DataRobot provides complete automation for machine learning: it automatically builds and evaluates thousands of models and manages all the deployed models and datasets. Presence AI helps B2B customers replace calls with text and automates recurring tasks like sending booking reminders and confirmations. Smith is a receptionist service that automatically captures website leads, books clients, and builds relationships with existing clients. Codota is an assistant for Java developers trained on millions of lines of code: you can pull up code examples with a click, follow standards, and reduce errors in Java. Zenbo is a small robot that can speak, connect, learn, and express itself. vSpatial is a workplace of the future for remote access to your PC and collaboration with other participants.