Recipe – Coconut Jelly (Chinese bubble tea shops’ version)

Ingredients:

  • 475g coconut water (suggest buying a 1L bottle from the supermarket)
  • 100g whole fat milk
  • 50g heavy whipping cream (you can replace this with milk if you want it less creamy)
  • 150g condensed coconut milk (do not refrigerate; otherwise, it will solidify)
  • 7g gelatin powder
  • 40-60g sugar (adjust to taste)
  • 5.5 oz portion cups (one batch makes about 8 cups)

Instructions:

  1. In a large cooking pot, combine the condensed coconut milk, whole fat milk, 450g of coconut water, and sugar over low heat. Stir the mixture continuously to ensure it is well combined, but be careful not to bring it to a boil. Heat the mixture until it reaches 60 degrees Celsius.
  2. In a separate bowl, dissolve the gelatin powder in the remaining 25g of coconut water. Mix well until the gelatin powder is fully dissolved.
  3. Remove the pot from the heat, then add the gelatin mixture to the coconut milk mixture from Step 1. Stir until the gelatin is fully dissolved and combined; do not reheat the mixture.
  4. Pour the mixture into the portion cups.
  5. Chill the cups in the refrigerator for at least 4 hours or until the coconut jelly is set.
  6. Serve the coconut jelly cold with optional toppings like chopped mango and coconut flakes.

Enjoy your delicious and refreshing Coconut Jelly!

Updates from Google Data Summit 2022

Event Page: https://cloudonair.withgoogle.com/events/summit-data-cloud-2022

This is a summary of the announcements discussed during Google Data Summit 2022. Hopefully, this helps you find all the updates easily.

AI/ML 

  • Vertex AI Workbench (GA), bringing together Google Cloud’s data and ML systems into a single interface so that teams have a common toolset across data analytics, data science, and machine learning. 
    • With native integrations across BigQuery, Spark, Dataproc, and Dataplex
    • Data scientists can build, train and deploy ML models 5X faster than traditional notebooks. 
    • Fully managed compute with admin controls (serverless, scalable, enterprise-ready JupyterLab)
  • MLOps tooling for AI
    • Introducing Vertex AI Model Registry, a central repository to manage and govern the lifecycle of your ML models. Designed to work with any type of model and deployment target, including BigQuery ML, Vertex AI Model Registry makes it easy to manage and deploy models. 
    • BQML added to Vertex AI Model Registry (Public Preview). With this integration, BQML users can leverage Vertex AI for MLOps. Specifically, users can:
      • Register BQML models with the Model Registry
      • Deploy BQML models directly from the Model Registry to Vertex AI endpoints
  • Powerful AI compute without moving data – Serverless Spark; BQML
  • Matching Engine – Vertex AI Matching Engine provides the industry’s leading high scale, low latency, vector-similarity matching (also known as approximate nearest neighbor) service, and industry-leading algorithms to train semantic embeddings for similarity-matching use cases.

Data Analytics and BI

BigLake

  • BigLake (Preview)
    • A storage engine that unifies data lakes and warehouses by providing consistent fine-grained security and performance acceleration across multi-cloud object stores. This extends BigQuery capabilities to open formats and OSS engines.
    • Provides an API interface that enables customers to query data across GCP and OSS engines, unlocking new use cases without building new infrastructure.

BigQuery

  • Analytics Hub (Preview), a fully managed service built on BigQuery that allows data sharing partners to efficiently and securely exchange valuable data and analytics assets across any organizational boundary.
  • BigQuery Storage Read API (GA), which allows OSS engines to read data from BigQuery tables.
  • BigQuery BI Engine SQL Interface (GA). BigQuery BI Engine is a fast, in-memory analysis service that lets users analyze data stored in BigQuery with sub-second query response times and high concurrency. It is compatible with any BI tool that uses BigQuery.
  • Log Analytics capabilities in BigQuery (Preview) – Log Analytics gives you the analytical power of BigQuery directly in Cloud Logging, with a new user interface optimized for analyzing log data. With Log Analytics, you can use SQL to perform advanced log analysis.
  • Native JSON (Preview) – By using the JSON data type, you can ingest semi-structured JSON into BigQuery without providing a schema for the JSON data upfront. This lets you store and query data that doesn’t always adhere to fixed schemas and data types (a minimal query sketch follows this list).
  • Search (Preview) – Fully managed search indexes let you pinpoint specific text in data of any size across multiple columns and semi-structured datasets using SQL.
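
As a minimal, hedged sketch of what querying such a JSON column can look like (this uses the Google.Cloud.BigQuery.V2 C# client; the project, dataset, table, and field names here are hypothetical):

using System;
using Google.Cloud.BigQuery.V2;

class JsonColumnDemo
{
    static void Main()
    {
        // Hypothetical project; assumes a table demo.events with a JSON column named payload.
        var client = BigQueryClient.Create("my-project");

        // JSON_VALUE extracts a scalar from the JSON payload; no upfront schema was
        // required for the payload contents when the rows were ingested.
        var sql = @"SELECT JSON_VALUE(payload, '$.user.name') AS user_name
                    FROM `my-project.demo.events`
                    WHERE JSON_VALUE(payload, '$.event_type') = 'click'";

        foreach (var row in client.ExecuteQuery(sql, parameters: null))
        {
            Console.WriteLine(row["user_name"]);
        }
    }
}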

Spark on Google Cloud

  • First-in-industry serverless Spark for all workloads (GA). Auto-scales without any manual infrastructure provisioning or tuning for Spark; empowers customers to shift from managing clusters to managing workloads.
  • Pervasive Spark experience (Preview). Connect, analyze, and execute Spark jobs from BigQuery, Vertex AI, or Dataplex in two clicks, without any custom integrations, using the best of Google-native and open-source tools.
  • Flexibility of consumption (GA). One size does not fit all: choose between serverless, Google Kubernetes Engine (GKE), and compute clusters for your Spark applications.

Data Processing

  • Dataflow Prime (GA): gain real-time insights and actions with an industry-leading serverless architecture. This update enables key features like Vertical Autoscaling, Right Fitting, and Smart Diagnostics.

Security and Data Governance

  • Dataplex (GA) – a unified platform for data management and data governance, including features like cataloging, quality control, classification, security, and lifecycle management.
  • Data lineage (Preview) – record, visualize, and understand the relationships between data assets based on the flow of data, mapping out the journey of how data assets are sourced and transformed. This will be part of Dataplex.
  • Build a secure data warehouse with the new security blueprint

BI

  • Data Studio has been moved into the Google Cloud portfolio (from Google Ads). Data Studio is now part of the Looker team. It will still be free; it is simply being strategically linked and further developed as part of GCP’s overall BI strategy.
  • Looker + Data Studio integration (GA), enabling Google Data Studio users to read Looker’s semantic model.
  • Connected Sheets for Looker (Preview) – lets Google Sheets read Looker’s semantic model (for Looker Enterprise customers).
  • Better data blending in Data Studio – H1 2022
  • New filter shelf in Data Studio – H1 2022
  • Looker for Google Slides – GA – H2 2022
  • Google maps in Looker – GA – H2 2022
  • Ask Looker – Preview Q2 2022
  • Anomaly and Insight Detection – Preview H2 2022
  • Improved Looker mobile experience – GA H2 2022
  • Looker for Tableau and Power BI – Preview H2 2022
  • Looker Marketplace Developer Program – Preview Q1 2022
  • Looker integration with data workflow orchestration tools like Apache Airflow or Cloud Composer – GA Q1 2022
  • VPC – service controls – GA Q2 2022

Data Analytics Accelerators

Data Management

  • Cloud SQL Insights for MySQL (Coming Soon) – this will provide capabilities similar to those already available in Cloud SQL for PostgreSQL.
  • Cloud Spanner change streams (Coming Soon) – this allows you to replicate database changes in real time and enables easy access and integration with your analytics platform.
  • Modernize your Oracle workloads to PostgreSQL with Database Migration Service (Preview)
  • Migrate from Apache HBase to Cloud Bigtable with Live Migrations (GA)
  • Google Cloud Cortex framework: Data foundation for SAP & beyond

Resource Bank for Google Dataflow

This post is a resource bank for those who would like to learn more about Google Dataflow. Cloud Dataflow is a fully managed Google Cloud Platform service for running batch and streaming Apache Beam data processing pipelines. The product team initially created this list, and I updated it with public-facing links. Hopefully, it is useful for you. Please feel free to comment below if you see other helpful links.

Coursera

https://www.coursera.org/learn/developing-pipelines-on-dataflow

Dataflow Academy

Complete Learning Resources on Apache Beam site: https://beam.apache.org/documentation/resources/learning-resources/#getting-started 

Hands-on labs provided by Qwiklabs (you can register a new account): https://beam.apache.org/documentation/resources/learning-resources/#interactive-labs

Getting Started

Code Samples

Guides

Community

Useful Documentations about Choosing the Right Datastore

It is important to understand the difference between relational and non-relational storage solutions, because different solutions fit different use cases and application types.

I have organized some resources below to help you understand the differences between the two; hopefully, after reading them, you can:

• Know the different datastore categories available

• Know the different use cases for datastore workloads

• Know the Azure and OSS datastore offerings available

• Design solutions that pick the right datastore for the use case

Choosing the Right Datastore

https://docs.microsoft.com/en-us/azure/architecture/guide/technology-choices/data-store-overview

Criteria for choosing a data store

https://docs.microsoft.com/en-us/azure/architecture/guide/technology-choices/data-store-comparison

Choosing a big data storage technology in Azure

https://docs.microsoft.com/en-us/azure/architecture/data-guide/technology-choices/data-storage

Choosing a search data store in Azure

https://docs.microsoft.com/en-us/azure/architecture/data-guide/technology-choices/search-options

Choosing a stream processing technology in Azure

https://docs.microsoft.com/en-us/azure/architecture/data-guide/technology-choices/stream-processing

Choosing a batch processing technology in Azure

https://docs.microsoft.com/en-us/azure/architecture/data-guide/technology-choices/batch-processing

Choosing a real-time message ingestion technology in Azure

https://docs.microsoft.com/en-us/azure/architecture/data-guide/technology-choices/real-time-ingestion

Thank you,

Your friend,

Annie

Power BI Adjacent Microsoft Services and Products

Power BI by itself has a lot of features and functionality to help you with your job or business. However, it won’t solve every problem, so in some cases you may want to pair it with other services.

Power BI Premium Capacity

By default, most people use the Power BI Pro license level. But when should you consider leveraging Power BI Premium capacity? There are a few things it enables: 1. storing larger datasets; 2. faster DirectQuery/Live Connection queries; 3. support for paginated reports; 4. unlimited content sharing for both internal and external viewers.

If you want to know more information about Power BI Premium Service, please refer to this Microsoft doc.

Azure Synapse Analytics

Azure Synapse Analytics (ASA) is a great alternative for your Power BI model development. It allows you to query petabytes of data performantly at a reasonable price by leveraging Power BI’s DirectQuery mode. The ASA SQL pool is an MPP (massively parallel processing) solution with result-set caching and materialized views built in, enabling high volumes of concurrent queries. By building your data warehouse in ASA, as compared to Power BI alone, you get an enterprise-grade data preparation and data warehousing solution that provides a single source of truth for Power BI and other applications, plus centralized security. In addition, the studio experience within ASA enables team collaboration among data scientists, data engineers, DBAs, and BI developers.

For more information about this topic, please refer to this whitepaper.

AI and Machine Learning Services

There are 3 ways you can extend your BI data to AI insights within Power BI.

1. AutoML in Power BI dataflows; Official Doc Tutorial

2. Calling existing Cognitive Services; Official Doc Tutorial

3. Calling machine learning models stored in Azure Machine Learning; Official Doc Tutorial

Azure Data Lake Storage Gen 2

By default, data used with Power BI is stored in internal storage provided by Power BI. With the integration of dataflows and Azure Data Lake Storage Gen2 (ADLS Gen2), you can store your dataflows in your organization’s Azure Data Lake Storage Gen2 account.

Simply put, a dataflow within Power BI is Power Query + the Common Data Model (CDM). With the Common Data Model, organizations can use a data format that provides semantic consistency across applications and deployments. And with Azure Data Lake Storage Gen2 (ADLS Gen2), fine-grained access and authorization control can be applied to data lakes in Azure.

For more information about this topic, please refer to this official doc.

Azure Analysis Services

The core data modeling part of Power BI is actually Analysis Services. So, using Azure Analysis Services (AAS), you can create data models independently from the Power BI service. Right now, the largest SKU of AAS allows you to store up to 400GB of data. You can allow different reporting applications (like Power BI, Tableau, and Excel) to connect to your AAS models. Within Microsoft, the Analysis Services team is a sub-team of Power BI. The product roadmap for Power BI is to minimize the data modeling gap between Power BI Premium capacity and Azure Analysis Services. That is why we are seeing recent Power BI updates that include enterprise-grade data modeling capabilities such as increased dataset size limits, the XMLA endpoint, and deployment pipelines. However, most of those features are in preview and about to go GA. In short, if you are migrating your on-premises SSAS model to Azure, or you do not want to wait for the Power BI features to go GA, then AAS is your easiest option. Otherwise, choosing Power BI Premium to host your reports and data models is strongly recommended.

Reference: https://azure.microsoft.com/en-us/services/analysis-services/#overview

Power Automate & Power App

When I think about Power BI alongside the other Power Platform tools, I think of it as an Insights + Action play. You can use Power BI to get insights from your data and then use Power Automate or Power Apps to take action on them.

The bi-directional connectors let you connect the services to each other, which means you can embed a Power BI report in a Power App, and you can embed a Power App in a Power BI report.

Reference:

  1. https://docs.microsoft.com/en-us/power-bi/collaborate-share/service-flow-integration
  2. https://powerbi.microsoft.com/en-us/blog/power-bi-tile-in-powerapps

Azure Logic Apps and Azure Monitor Log Analytics

• Synchronously refreshing a Power BI dataset using Azure Logic Apps (a sketch of the underlying REST call appears after this list).

Reference: A Step by Step Community Blog Post.

• Use Power BI to understand your Logic Apps consumption data.

Reference: A Microsoft Blog

• With Azure Monitor Logs, you can understand your Power BI Embedded consumption. You can also use Power BI to create customized reports from your Azure Monitor Logs.

Reference: Microsoft Doc
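
For context on the first bullet above: the Logic App ultimately calls the Power BI REST API’s dataset refresh endpoint. Here is a minimal, hedged C# sketch of that same call; the workspace and dataset GUIDs are placeholders, and acquiring the Azure AD access token is left out:

using System;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Threading.Tasks;

class RefreshDemo
{
    static async Task Main()
    {
        var groupId = "<workspace-guid>";   // placeholder
        var datasetId = "<dataset-guid>";   // placeholder
        var accessToken = "<aad-token>";    // placeholder; acquire via Azure AD

        using var http = new HttpClient();
        http.DefaultRequestHeaders.Authorization =
            new AuthenticationHeaderValue("Bearer", accessToken);

        // POST .../refreshes starts an asynchronous refresh; 202 Accepted means it was queued.
        var url = $"https://api.powerbi.com/v1.0/myorg/groups/{groupId}/datasets/{datasetId}/refreshes";
        var response = await http.PostAsync(url, content: null);
        Console.WriteLine(response.StatusCode);
    }
}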

Azure Stream Analytics

This is a solution for true real-time analytics in Power BI. It enables you to visualize your IoT or event data.

A typical architecture is: device data -> Event Hub -> Azure Stream Analytics -> Power BI streaming dataset -> Power BI dashboard.
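
For reference, the last hop works because a Power BI streaming dataset exposes a push REST endpoint. Here is a minimal, hedged C# sketch of pushing one row; the push URL (including its key) is a placeholder you copy from the streaming dataset’s API info page, and the row fields are hypothetical:

using System;
using System.Net.Http;
using System.Text;
using System.Threading.Tasks;

class StreamingPushDemo
{
    static async Task Main()
    {
        // Placeholder: copy the real push URL from the streaming dataset's API info page.
        var pushUrl = "https://api.powerbi.com/beta/<tenant>/datasets/<id>/rows?key=<key>";

        using var http = new HttpClient();
        // The body is a JSON array of rows whose fields match the streaming dataset's schema.
        var row = "[{ \"deviceId\": \"sensor-01\", \"temperature\": 21.5, \"ts\": \"" +
                  DateTime.UtcNow.ToString("o") + "\" }]";
        var response = await http.PostAsync(pushUrl,
            new StringContent(row, Encoding.UTF8, "application/json"));
        Console.WriteLine(response.StatusCode); // 200 OK on success
    }
}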

Reference: Official Document

My blog: https://xuanalytics.com/2019/03/24/simulated-streaming-data-power-bi-embedded/

Microsoft Teams

You can add your Power BI report as an app in Teams.

Reference: Official Doc

Also, if you have admin credentials for Teams, you can analyze call quality data in Power BI.

Reference: Official Doc

Dynamics 365

• You can use Power BI to analyze the data in Dynamics 365 (OData or the Entity Store from LCS)

• You can pin your Power BI visuals to Dynamics 365

• You can use existing PowerBI.com solutions from Lifecycle Services

• There is a Power BI embedded experience within Dynamics (boxed)

References:

  1. Official Doc Configure Dynamics 365 Power BI Integration
  2. Official Doc: PowerBI.com Service in Dynamics F&O
  3. Official Doc: Get Dynamics Data in Power BI using Business Central
  4. Official Doc: Customize Dynamics F&O analytical workspace

In summary, many other services inside and outside the Microsoft product portfolio work well with Power BI. So consider thinking outside the box and creating different solutions for your business.

Your friend,

Annie

Load data from Cosmos DB to Event Hub using an Azure Function in the Azure Portal

One way to load data from Cosmos DB to Event Hub is to use an Azure Function. Although there are many code samples out there for creating such a function, if you are like me and do not have much application development experience, reading those samples is a bit challenging. Luckily, the Azure Portal makes it easy.

  • Navigate to your Cosmos DB in the Azure Portal; on the left panel, you can find a choice under Settings called ‘Add Azure Function’.

CosmosDB Azure Function 1

  • Fill in the form that pops up after you click ‘Add Azure Function’, choose C# as the function language, then click ‘Save’.
    • CosmosDB Azure Function 2

 

  • The portal will automatically navigate to the Azure Function Apps page, and you will find the newly created Azure Function there. The next step for us is to add the data target; in this case, it is Event Hub. To add the Azure Function output target:
    • Click on ‘Integrate’
    • Click ‘New Output’ in the center form
    • Then choose ‘Azure Event Hub’ from the output choices
    • Save
      CosmosDB Azure Function 3
  • After you hit Save, you will see the output form. Change the Event Parameter Name field to ‘$return’, and make sure ‘Use Function Return Value’ is checked before clicking ‘Save’.
    CosmosDB Azure Function 4
  • The next step is to update the Azure Function run.csx code to create the data integration code.
    • Click on the function name itself and the run.csx editing page will show up.
    • Then replace the current code with the data integration code (a hedged sketch of what this can look like follows this list).
    • And click on ‘Save and Run’.
    • Please note (the part I highlighted in yellow): depending on which runtime version your Azure Functions app uses, the reference can be either #r "Microsoft.Azure.Documents.Client" or #r "Microsoft.Azure.DocumentDB.Core", so swap between the two if you run into an error.
      • CosmosDB Azure Function 5
  • Now your Azure Function is up and running, capturing changes and sending the changed documents to Event Hub.
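
For reference, here is a minimal sketch of what such a run.csx can look like. This is not the exact code from the screenshot, just an illustration of a Cosmos DB change-feed trigger whose return value is bound to the Event Hub output (because Event Parameter Name is ‘$return’); it assumes the v1 runtime, where the logger is a TraceWriter:

// Reference per the runtime note above; use Microsoft.Azure.Documents.Client on older runtimes.
#r "Microsoft.Azure.DocumentDB.Core"

using System;
using System.Collections.Generic;
using Microsoft.Azure.Documents;

// Fired by the Cosmos DB change feed; the returned string is sent to Event Hub
// through the '$return' output binding configured above.
public static string Run(IReadOnlyList<Document> input, TraceWriter log)
{
    log.Info($"Documents modified: {input.Count}");

    // Serialize the changed documents as a JSON array for the Event Hub message.
    var docs = new List<string>();
    foreach (var doc in input)
    {
        docs.Add(doc.ToString()); // Document.ToString() emits the document's JSON
    }
    return "[" + string.Join(",", docs) + "]";
}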

 

Thanks,

Your friend, Annie

Microsoft Bot 101

There are a lot of concepts in Microsoft’s Conversational AI services. My teammate and I organized the following concepts and resources for you; hopefully, they help you get started.

  • Types of Bots:
    • Knowledge

This type of bot answers from a knowledge base, a pre-stored database that hosts question-and-answer pairs.

Source Link

  • Enhancement

This is a conversational helper built as an enhancement to an existing, working application.

  • Functional

The bot itself is a full application.

  • Virtual Assistant

An open-source Virtual Assistant bot solution that provides you with a set of core functional capabilities and full control over the end-user experience.

Source Link

  • Technologies
    • Q&A (aka a knowledge base that stores question-and-answer pairs) Link
      • Score

When you ask the bot a question, it returns a score indicating how closely (as a percentage) the question matches existing questions in the knowledge base, using Microsoft’s internal fuzzy-match logic.

  • Metadata

Metadata is data about data; it allows us to distinguish one piece of information from the next. By adding metadata tags to your question-and-answer pairs in a Q&A knowledge base, the bot can better filter and answer for the end user.

Source Link

  • Single/Multiple Turn

The distinction between single-turn and multi-turn is that a multi-turn conversation has follow-up questions.

Source Link

  • LUIS

Language Understanding (LUIS) is a cloud-based API service that applies custom machine-learning intelligence to natural language text to predict overall meaning and pull out relevant, detailed information. Link

  • Intents

The overall meaning of the phrase

  • Entities

The detailed pieces of information that LUIS extracts.

  • Patterns

A pattern is a combination of expression matching and machine learning used to improve intent and entity prediction.

Source Link

  • Utterances

Text in the user’s own words.

  • Natural language process

The process that takes a user’s unstructured text input and returns a JSON-formatted response with a top intent. LUIS supports prebuilt NLP models as well as custom models for domain-specific natural language.
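
As a rough, hedged sketch of that request/response cycle (this assumes the LUIS v2 prediction REST endpoint; the app ID, prediction key, region, and utterance are placeholders):

using System;
using System.Net.Http;
using System.Threading.Tasks;

class LuisDemo
{
    static async Task Main()
    {
        var appId = "<luis-app-guid>";   // placeholder
        var key = "<prediction-key>";    // placeholder
        var utterance = Uri.EscapeDataString("book me a flight to Seattle");

        using var http = new HttpClient();
        http.DefaultRequestHeaders.Add("Ocp-Apim-Subscription-Key", key);

        // The JSON response contains topScoringIntent plus any extracted entities.
        var url = $"https://westus.api.cognitive.microsoft.com/luis/v2.0/apps/{appId}?q={utterance}";
        Console.WriteLine(await http.GetStringAsync(url));
    }
}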

  • Adaptive Cards

They are a way to present a self-contained UI to a user within a larger UI such as a chat client.

Source Link

  • OAuth

OAuth (Open Authorization) is an open standard for token-based authentication and authorization on the Internet. You can add authentication capabilities to your bot. Source Link

  • Bot specific concepts
    • Channels

A channel is a connection between the bot and communication apps.

  • Turn

A single request-and-response exchange (in and out).

  • Events

Microsoft Teams sends notifications to your bot for events that happen in scopes where your bot is active.

Source Link

  • Conversation

A set of request and response groups

  • Prompts

A prompt is used whenever a bot needs input from the user.

Source Link

  • Dialogs

You can use dialogs to model a conversation and manage conversation flow.

Source Link

  • State

State data can be used for many purposes, such as determining where the prior conversation left off or simply greeting a returning user by name.

Source Link

  • Dispatch

If a bot uses multiple LUIS models and QnA Maker knowledge bases, you can use the Dispatch tool to determine which LUIS model or QnA Maker knowledge base best matches the user input.

Source Link

Thank you for reading.

Your friend, Annie

Power BI Refresh Failed – Troubleshooting

Issue: A few Power BI datasets that had been refreshing successfully recently show inconsistent refresh failures.

Possible Problems and Solutions:

a. License level: Power BI Premium has different license levels, and Premium capacity is shared within a tenant and can be shared by multiple workspaces. The maximum number of models (datasets) that can be refreshed in parallel is based on the capacity node; for example, P1 = 6 models, P2 = 12 models, and P3 = 24 models. Beyond that, you must understand how memory is managed by the capacity. If memory is consumed by interactive processes (users interacting with reports), the refreshes will go into a queue or possibly fail, depending on resource utilization. The best recommendation I can provide is to read this whitepaper, https://docs.microsoft.com/en-us/power-bi/whitepaper-powerbi-premium-deployment, which provides much more detail. In addition, I would recommend installing the Power BI Premium Capacity App and beginning to analyze the model refreshes. This should assist in identifying and building a model refresh pattern that minimizes or eliminates refresh failures.

  1. So, to solve the issue, try limiting parallel refreshes, or
  2. Increase the Power BI Premium license level through the O365 admin portal, or purchase Power BI Embedded capacity through the Azure Portal and assign the dedicated capacity to workspaces.

https://docs.microsoft.com/en-us/power-bi/service-premium-what-is#subscriptions-and-licensing

https://docs.microsoft.com/en-us/power-bi/service-premium-what-is

b. Source database activity: check the source database activity during the times the datasets are being refreshed. There might be database locks or database backups going on.

c. Network: Network communication from the data source server to the Power BI service in the Azure cloud might be another cause. Please check this community link for troubleshooting: https://community.powerbi.com/t5/Service/Power-BI-dataset-refresh-failing/td-p/196905 . If the network is the issue, we can further explore network-related solutions such as adding Azure ExpressRoute.

Thanks,

Your friend Annie.

My first week at Microsoft

On Monday, March 3rd, 2019, I started my journey at my dream company, Microsoft. I would love to share the experience I have acquired over the past few days.

How did I get in

You may be wondering why I’ve decided to join Microsoft and how I got here, so let me tell you my story leading up to today. After finishing my master’s degree in Information Systems from Indiana University, I joined Anthem (one of the largest health insurance companies in the industry).

My team back then relied heavily on Microsoft products and worked closely with its supporting team to build its world class data infrastructure to support Anthem’s customer operation business. From that experience, I was not only fascinated by its products and services but also its community, filled with passionate and supportive individuals who love to learn and share.

Feeding off of this passion, I began joining local PASS SQL groups, attended and presented at SQL Saturdays, got my Microsoft certification with SQL Server 2016, and started this very tech blog you are reading. After a few years in the workplace, I started to sense that cloud, big data, and AI were and are the future of our world, and that the space provided me with a unique set of problems to solve and opportunities to share my knowledge with others, both being things I love.

In order to broaden my cloud technology skills and get more involved in direct customer interactions where I could help customers with my skills, I accepted an offer from MCAConnect, a Microsoft Gold Partner consulting firm, as a senior consultant. From that experience, I got the opportunity to leverage Microsoft cloud technologies to help customers of different sizes, working directly with executive- and management-level clients to build solutions for their business needs. I took great pride in being able to leverage my knowledge of Microsoft technologies to help impact, change, and improve the various business processes of varying clients.

Yet despite the opportunities I was given, I still casually found myself daydreaming of one day pacing around the Seattle office with a fresh, hot-off-the-printer Microsoft badge, swimming in one of the richest pools of knowledge in the world. It wasn’t until the end of 2018 that my daydreaming became a reality. I had attended SQL Saturday in Washington, DC, and after the event, the Microsoft One Commercial Partner (OCP) group reached out to me about a few openings. I reached out to my Microsoft friends, my mentor James, and the hiring manager Justin to better understand the job and Microsoft’s working culture. I found that the Cloud Solution Architect for Big Data and AI role under Justin’s team was the best fit for me and aligned well with my personal goals, and shortly after our discussion, I applied for the position.

After 1 general interview and 2 technical interviews, I got the offer.

To be honest, I felt like I had bombed my technical interviews, as I wasn’t able to answer a ton of the questions asked. It wasn’t until later, after reading Microsoft CEO Satya Nadella’s book ‘Hit Refresh’, that I realized why it actually made sense that I got the offer: Microsoft doesn’t hire know-it-alls but learn-it-alls, welcoming people with a growth mindset. The moment I got the Microsoft offer, I admit I was ecstatic, more so than I had ever been in my professional career, as it was a truly elusive dream-come-true moment. Of course, after waiting through a lengthy hiring process, done through an internal career system which tracks progress and collects the necessary documents, I found myself talking with my HR manager about onboarding, a complicated process due to the size of the company. Along the way, my manager and the HR manager worked diligently and patiently, communicating with me regularly throughout the one-month process, reducing my anxiety levels significantly.

Redmond, Seattle and Microsoft Campus

My New Employee Orientation was set to be at Microsoft’s headquarters in Redmond, WA. My flight, hotel, and car rental were pre-booked by the company based on my schedule. Last Sunday evening, I landed at the Seattle airport, and from the rental car office to the hotel, I got all kinds of Welcome to Microsoft signs and greetings from the service companies, an experience comparable to a homecoming. The hotel I stayed at was the Marriott in Redmond Town Center, a lovely shopping center with all kinds of restaurants, fitness studios, and shops.

The next morning, I drove to campus to attend New Employee Orientation (NEO). The building was called Building 92, and at first, I thought it must have been a random number, as I could not imagine any company having more than 90 buildings on its headquarters campus alone. It was, however, true, and to my astonishment, the campus was still expanding. It was so large that transportation between buildings often relied on shuttle cars. It had multiple walking/running trails, football and basketball courts, and on-campus shopping centers, like those in Disneyland, and for every 3-4 buildings there was a food court, summing up to over 30 food courts on campus providing all kinds of cuisines.

The campus was covered with trees, creating the feeling of working in a park. Later, my manager showed me and one of my teammates the on campus treehouses designed by someone famous, though I couldn’t remember the name, and those treehouses were accessible by all employees based on reservation. Then for the next few days, I visited multiple buildings. The offices were illuminated with bright natural light and were separated into different rooms, enhancing my mood as well as providing a feeling of privacy. It fit the culture of Microsoft, formal but fun.

Also, very important, provided for free on every floor and corner, was a variety of drinks and office supplies free for the taking.

New Employee Orientation (NEO)

NEO happened in building 92 which was the visitor’s center. It started with new employees (NE) gathering in the product showcase room where we had access to coffee, Xbox, VR, and networking opportunities. Some posed for a picture in front of a photo featuring the first team of Microsoft employees all of different ages, genders, and cultural backgrounds.

At around 8:00 a.m., we were led through different stations to process our I-9, get our photo taken for employee badges, and have breakfast before entering the conference room where the NEO meeting was hosted. In the room, we were randomly assigned to different tables, done on purpose so for us to interact with individuals from different areas to gain exposure to the different groups in the company. Leading hosts of this events were senior leaders and employees, all who were volunteers. It started with simple greetings and storytelling exercises for each table. Then one of the hosts delved into the company culture and his experiences at Microsoft, followed by the HR manager introducing the benefits which Microsoft provided. During this session, nearly everyone around me, including myself, whispered amongst each other, doubting what we heard and checking our sanities amongst each other. Are all of those really free and reimbursable? I have to say that although the general benefits including 401k, health insurance, family support, Health Savings Accounts, and employee stock purchase plan were amazing, the ones that caught my attention were the fitness benefit, tuition support and donation matches, as my health and growth are the two stabilizing factors to my career.

Following the benefits introduction, there were more stories and program introductions provided by different speakers, all of which reinforced my excitement for finally being here. During lunch, hiring managers came to the conference room to greet their new hires before taking them to lunch. My manager Justin came to welcome me and my new teammate Alex, and proceeded to take us to the café building nearby to provide us with our new PC. After lunch we were sent back to the NEO building to finish up with our security settings and the benefit vendor tour.

NEO officially ended at around 2pm. Justin soon after gave us a brief introduction meeting and provided some tips on what to expect for the next couple of days and weeks. He also told us there was a TechFest on campus happening this week so in lieu of a short Q&A meeting with him each day, we were welcome to attend the event or leave earlier to check out the ins and arounds of Seattle.

TechFest and Seattle area

TechFests, in conjunction with other events, happen every now and then and offer various keynote sessions and presentations. Here, you can learn about the latest research happening in the tech world as well as inside different teams at Microsoft. This week’s TechFest was about AI and machine learning. I was so impressed by all the new inventions and studies out there in our field, and felt so fortunate to be so close to the inventors and researchers, allowing me to talk directly with them about their ideas while asking questions. It was quite an experience to be surrounded by such creative and curious individuals, all so very accomplished and intelligent, all leaders of the industry.

After each day, I had a chance to visit the Seattle area, a place where I have never been before. It was such a beautiful city with lakes, ocean, and mountains, home to multiple large corporations such as Microsoft, Amazon and Starbucks. There were so many hiking, running, and biking trails, various outdoor activities, a paradise for the adventurous soul. It was also one of the more foodie cities I have seen, as all of those companies and universities embrace diversity, offering a vast choice of ethnic cuisines from around the world.

My experience during my first week at Microsoft truly reflects its mission statement, “Empower every person and every organization on the planet to achieve more”, and its culture of valuing a growth mindset. Microsoft is a place that an aspiring technology professional like myself can use as a platform to amplify my skill set and impact the world.

I am ready for this new chapter of my career.

Thanks for reading,

Your friend, Annie

You may need a table variable for IRR and NPV in DAX

I finished my Fundamental Finance Accounting course last week. I learned two awesome functions that help decide which projects to undertake. The first is NPV (Net Present Value) and the second is IRR (Internal Rate of Return). I found these two so powerful for deciding which investment to choose that I wanted to implement them in DAX. Fortunately, DAX has two functions, XNPV and XIRR, that I can use. XNPV and XIRR also exist in Excel as replacements for NPV and IRR. They have slightly different calculation details, but the purpose is the same. See http://www.differencebetween.net/business/finance-business-2/difference-between-npv-and-xnpv/ and https://corporatefinanceinstitute.com/resources/knowledge/modeling/xirr-vs-irr-excel-financial-model/ for more details.

The Excel XNPV function is a financial function that calculates the net present value (NPV) of an investment using a discount rate and a series of cash flows that occur at irregular intervals: =XNPV(rate, values, dates)
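
For reference, Excel documents the XNPV calculation as the following, where $P_i$ is the $i$-th payment, $d_i$ its date, and $d_1$ the first payment date:

$$\mathrm{XNPV} = \sum_{i=1}^{N} \frac{P_i}{\left(1 + \text{rate}\right)^{(d_i - d_1)/365}}$$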

The Excel XIRR (Internal Rate of Return) is the discount rate that sets the net present value (XNPV) of all future cash flows of an investment to zero. If the NPV of an investment is zero, it doesn’t mean it’s a good or bad investment; it just means you will earn the IRR (discount rate) as your rate of return: =XIRR(values, dates, guess)

The challenge in DAX, I found, is that we may not have a date field. For example, we may just have an initial investment and an estimated payment amount every year or every month in the future. Something like this:

Assumption - IRR NPV

So in DAX, we can create a table variable in the measure to leverage XIRR and XNPV.

XIRR =
VAR cashflowtable =
    UNION (
        GENERATE (
            GENERATESERIES ( YEAR ( NOW () ), YEAR ( NOW () ) ),
            VAR inc = [Value]
            RETURN
                ROW (
                    "date", DATE ( inc, 1, 1 ),
                    "Values", SUM ( 'Considered Project'[Initial Inventestment] )
                )
        ),
        GENERATE (
            GENERATESERIES ( YEAR ( NOW () ) + 1, YEAR ( NOW () ) + [Investment Years] ),
            VAR inc = [Value]
            RETURN
                ROW (
                    "date", DATE ( inc, 1, 1 ),
                    "Values", SUM ( 'Considered Project'[Annual Cash flow] )
                )
        ),
        GENERATE (
            GENERATESERIES (
                YEAR ( NOW () ) + [Investment Years],
                YEAR ( NOW () ) + [Investment Years]
            ),
            VAR inc = [Value]
            RETURN
                ROW (
                    "date", DATE ( inc, 1, 1 ),
                    "Values", SUM ( 'Considered Project'[Exit Value] )
                )
        )
    )
RETURN
    XIRR ( cashflowtable, [Values], [date] )
XNPV =
VAR cashflowtable =
    UNION (
        GENERATE (
            GENERATESERIES ( YEAR ( NOW () ), YEAR ( NOW () ) ),
            VAR inc = [Value]
            RETURN
                ROW (
                    "date", DATE ( inc, 1, 1 ),
                    "Values", SUM ( 'Considered Project'[Initial Inventestment] )
                )
        ),
        GENERATE (
            GENERATESERIES ( YEAR ( NOW () ) + 1, YEAR ( NOW () ) + [Investment Years] ),
            VAR inc = [Value]
            RETURN
                ROW (
                    "date", DATE ( inc, 1, 1 ),
                    "Values", SUM ( 'Considered Project'[Annual Cash flow] )
                )
        ),
        GENERATE (
            GENERATESERIES (
                YEAR ( NOW () ) + [Investment Years],
                YEAR ( NOW () ) + [Investment Years]
            ),
            VAR inc = [Value]
            RETURN
                ROW (
                    "date", DATE ( inc, 1, 1 ),
                    "Values", SUM ( 'Considered Project'[Exit Value] )
                )
        )
    )
RETURN
    XNPV ( cashflowtable, [Values], [date], [Interest Rate] )

Hopefully, you find this helpful.

Thanks,

your friend, Annie
