On November 19, 2020, we visited It's Fun, the developer of "Ulala: Idle Adventure" and "Sausage Party", and held a salon on game data analysis in the company's library. The speaker was He Peng, head of the data project for Ulala. Drawing on the project's hands-on experience with the TA system, he explained how the team uses it to get more value out of its data. This article is a partial transcript of his talk; the full video is available on ThinkingData's official website.
Hello everyone. I'm honored to be invited to share some of It's Fun's internal practice, mainly our experience with data analysis on the Ulala project. It has been less than a year since we adopted the TA system, yet in that short time both the overall management of the Ulala project and our use of its data have improved considerably. Today's sharing has four parts:
01. Why we chose to adopt the TA system
We adopted the TA system so that everyone on the project has the opportunity to analyze data on a single platform. Even without any background in data analysis (say, you don't know what terms like "API" mean), you can explore on the TA system. The first requirement is a convenient platform where you can simply look at the data; once you look, you inevitably form ideas about what you want to analyze. This also embodies It's Fun's values: be data-driven and user-first, and make products users love, giving them more choices and bringing them lasting, genuine happiness. The point is that what we offer users should not rest only on our own intuition or project experience; it should all be grounded in data. Do users like it? Is it popular? The data has to answer; you cannot just assume, or the project will not last long.
First, let me describe the situation before Ulala connected to ThinkingData. When the project launched, the data architecture we used was essentially the full Alibaba Cloud "family bucket". Data collection started on the local game servers: logs from every server were gathered and forwarded to Alibaba Cloud. Part of the data went to OSS for log-machine queries, but that path was inefficient: the logs were only retained for 30 days, and the product demands solid SQL skills. After some data cleaning, most querying and analysis went through DLA against the cleaned data, which was assembled into a large wide table, split into tables by event, with users classified by attributes; query efficiency was still very low. Because that system only supports SQL queries, data analysis relied mainly on data analysts and data developers: tracking-point design, then data storage, analysis, and alerting all fell to the analysts and the students operating our platform. And because planners are generally not fluent in SQL, only a few could write queries themselves; the rest had to go through the data developers, confirming requirements back and forth before getting the results they wanted. Those round trips consumed a great deal of energy, efficiency was very low, and so was the efficiency of designing and validating tracking points. Our data was T+1, and in practice turnaround was often even slower.
On the data-design side, because we did not have many data developers, the whole pipeline was slow: collection was T+1, and if the cleaning was wrong or had a problem, everything had to be redone from the start. Analysis had poor timeliness and poor applicability. For display you could only use scripts, SQL, or a third-party BI tool, and customization was weak. If planning or operations wanted a report, or wanted to monitor a page in real time, they first had to file a request and then wait a long time before seeing it, often a week or more, because most of the time went into data plumbing and data verification. Even after an analysis requirement was raised, the data team might not respond quickly for lack of manpower, and by the time the work was done the activity in question might already have ended, so nobody could see how its data had changed.
02. Problems we ran into while adopting TA, and how we solved them
To solve these problems, we finally decided to connect to the TA system. The first problem we hit was that, although our data structure was broadly similar to ThinkingData's native one, there were differences in some fields, keywords, and tracking methods. Our data is also not only for our own use: third parties such as our partner XD depend on it, so we could not simply change the original tracking points, and refactoring the whole program would have been too costly. Instead we added a data-forwarding layer in the middle that maps our existing tracked fields onto the fields ThinkingData needs. Because the data format their monitoring uses is quite general, the migration went smoothly.
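As an illustration, such a forwarding layer can be as simple as renaming envelope fields and passing everything else through as event properties. This is a minimal sketch; the field names and the output shape below are my own assumptions, not TA's actual schema:

```python
# Hypothetical forwarding layer: converts a legacy log record into a
# TA-style event envelope without touching the original tracking code.
# All field names here are illustrative assumptions.

# Mapping from legacy field names to the names the analytics platform expects
FIELD_MAP = {
    "uid": "account_id",
    "ts": "time",
    "ev": "event_name",
}

def convert(legacy_record: dict) -> dict:
    """Rename the envelope fields; everything else becomes event properties."""
    event = {}
    properties = {}
    for key, value in legacy_record.items():
        if key in FIELD_MAP:
            event[FIELD_MAP[key]] = value
        else:
            properties[key] = value  # pass unknown fields through untouched
    event["properties"] = properties
    return event

if __name__ == "__main__":
    raw = {"uid": "u_1001", "ts": "2020-11-19 10:00:00",
           "ev": "pet_evolve", "pet_id": 7}
    print(convert(raw))
```

Keeping the mapping in one table is what makes this approach cheap: when the downstream format changes, only `FIELD_MAP` changes, and the original tracking code that third parties depend on stays untouched.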
Next, real-time data. Ulala's technical stack was sufficient on its own, so rather than operating and maintaining two environments we kept our previous components and only had to handle the interface: we reshaped our data to fit their API format with a lightweight collection-and-forwarding service and pushed it to TA through their data-access interface, which solved the real-time problem. In testing, tens of thousands of records per second uploaded simultaneously with no data loss.
Then historical data. The initial import took quite a while, and one question was how to push the data to the ThinkingData platform while guaranteeing uniqueness and staying efficient. Fortunately TA provides a variety of data-access interfaces; they proved efficient throughout testing and were very smooth to use. Now let me turn to the analysis platform itself, mainly our own experience with it.
Look at the process first. With a traditional BI report, the first problem is complexity: a single requirement involves at least the R&D and platform departments, the platform itself splits into a back end (computation) and a front end (display), and finally you run acceptance. The process is long, it spans departments, and cross-department work is hard. If a bug shows up in the middle, is it in the data reported by R&D, in the platform's computation logic, or in the front-end display? That is hard to troubleshoot, so the whole cycle is long and its cost is high. After adopting TA, our current process still starts by raising a requirement, but then we only need to develop the tracking point; the tracked data flows straight into the platform's own database, we accept the data itself as the source (for example, run the generated events and check that the reported data is correct), and then build the report ourselves. Overall complexity and turnaround time are greatly reduced, so the process optimization brought by the TA platform's modular tool set is quite significant.
Next, let me walk through our hands-on experience with the TA system.
First, event analysis and funnel analysis are what we use most often, and having them out of the box greatly improves efficiency. When we analyze daily activity, or the daily activity of different events, we know how to query it with our own DLA system, but the SQL has to be rewritten every time; for data analysts the cost is high, and whenever a new gameplay mode, a new gift pack, or a new pet launches, the analysis has to start again from zero. Then there is the more complex analysis: funnels, path analysis, and data drill-down, which explore the conversion at each step of the game and reveal user preferences and the conversion at key nodes. Writing path analysis from scratch in SQL is genuinely hard, and the deeper the analysis goes, the more its complexity grows, almost exponentially. With ThinkingData's product we simply operate the UI and add an event directly, and the whole analysis flow is smooth and convenient.
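To make concrete what a funnel computes, here is a hand-rolled toy version of the idea the platform provides out of the box: count, per user, how far along an ordered sequence of steps they got. The event names are invented for illustration:

```python
# Toy funnel: count how many users reach each step, in order.
FUNNEL = ["login", "enter_stage", "clear_stage", "purchase"]

def funnel_counts(user_events: dict) -> list:
    """user_events maps user id -> ordered list of event names.
    Returns, for each funnel step, how many users reached it."""
    counts = [0] * len(FUNNEL)
    for events in user_events.values():
        step = 0
        for ev in events:
            if step < len(FUNNEL) and ev == FUNNEL[step]:
                step += 1  # user advanced to the next funnel step
        for i in range(step):
            counts[i] += 1
    return counts

if __name__ == "__main__":
    data = {
        "u1": ["login", "enter_stage", "clear_stage", "purchase"],
        "u2": ["login", "enter_stage"],
        "u3": ["login"],
    }
    print(funnel_counts(data))  # [3, 2, 1, 1]
```

Even this toy shows why hand-written SQL scales badly: every added step or time-window rule multiplies the join logic, whereas a funnel tool only asks you to pick the events.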
User profiles cover user groups, user tags, and user attributes. User grouping captures, from multiple angles, the characteristics of a designated set of users: we analyze their current state from their attributes and behaviors, including payment, activity, and whether they participated in a given event. For example, Ulala recently launched a pet-evolution feature. Before we even started building it, we could analyze current pet players, that is, the participation rate of the pet module and players' spending on it. After the feature went live, we could compare: for players who were previously heavy pet players, and for those who had shown no interest in pets, once we had split them into tagged groups we could see the conversion, who plays more, who does not, and who engages more with the new pet-evolution module.
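The grouping logic described above can be sketched roughly as follows; the thresholds, field names, and tag names are my own illustrative assumptions, not the project's actual definitions:

```python
# Toy user tagging: label each player by pet participation and spending,
# then collect them into groups for before/after comparison.
def tag_user(u: dict) -> str:
    if u["pet_battles"] >= 10 and u["pet_spend"] > 0:
        return "heavy_pet_payer"
    if u["pet_battles"] >= 10:
        return "heavy_pet_free"
    return "pet_uninterested"

def group(users: list) -> dict:
    """Return a mapping from tag -> list of user ids."""
    groups = {}
    for u in users:
        groups.setdefault(tag_user(u), []).append(u["id"])
    return groups

if __name__ == "__main__":
    users = [
        {"id": "u1", "pet_battles": 25, "pet_spend": 30},
        {"id": "u2", "pet_battles": 12, "pet_spend": 0},
        {"id": "u3", "pet_battles": 1,  "pet_spend": 0},
    ]
    print(group(users))
```

Running the same tagging before and after a feature launch, then comparing each group's conversion, is exactly the kind of comparison described above.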
Custom queries are very helpful for data analysis and data development: for existing models, or for data the standard features cannot cover, we can still get the results we want through backtracking.
For example, a custom SQL query can cover requirements the built-in UI panels cannot, perhaps the last 20% of needs: we run the query, present the results, and feed them back to planning and operations.
Data backtracking: in Ulala we previously had no way to count new devices, because we had no client-SDK tracking points, only server-side data, so it simply could not be done. Through data backtracking we can take the device IDs collected historically, deduplicate them against the existing stock, and work out which devices are newly added; conversely, from device additions we can look at new devices and the conversion from new device to account creation, to role creation, to login. This mainly relies on maintenance in the platform's back end.
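The core of this backtrack is a first-seen deduplication over historical rows. A minimal sketch, assuming the input is simple (date, device_id) pairs rather than any real log format:

```python
# Toy backtrack: given historical (date, device_id) rows, derive which
# devices first appear on each day ("new devices"). Input format is assumed.
def new_devices_by_day(rows: list) -> dict:
    seen = set()          # the running "stock" of devices already counted
    result = {}
    for date, device in sorted(rows):  # sort so the earliest date wins
        if device not in seen:
            seen.add(device)
            result[date] = result.get(date, 0) + 1
    return result

if __name__ == "__main__":
    rows = [("2020-11-01", "d1"), ("2020-11-01", "d2"),
            ("2020-11-02", "d1"), ("2020-11-02", "d3")]
    print(new_devices_by_day(rows))  # {'2020-11-01': 2, '2020-11-02': 1}
```

The same first-seen pass, keyed on account or role instead of device, yields the device-to-account and account-to-role conversion steps mentioned above.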
Permissions are actually what I most want to talk about today, together with the TA system's dashboard sharing. The change TA brought us is closer collaboration across departments: analysis results are no longer viewed only by their authors but shared, so everyone can see your report and even your whole analysis process, which means your data is traceable.
Dimension tables plus virtual properties. With tracking points, your original design may differ from what you later want to analyze, or some points were simply never tracked; a dimension table lets you map them in afterwards, for example our media types or character classes. For recorded times, virtual properties are even more flexible. Ulala may differ from other games in that servers are divided by season: the log only records which season a server belongs to, while that season's server-open time is not well recorded. With a virtual property we map each season to its server-open time, take the event's timestamp, and compute which day since server open, that is, which day of the season's life cycle, the event occurred on. That gives us extra dimensional indicators to filter or group by when analyzing data.
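The derived "day of season" attribute is just date arithmetic over a dimension table. A minimal sketch, with the season names and open dates invented for illustration:

```python
# Toy virtual attribute: derive "day N since the season's server opened"
# from a dimension table mapping season -> server-open date. The logged
# event only records its season; the open date lives in the dimension table.
from datetime import date

SEASON_OPEN = {            # hypothetical dimension table
    "S1": date(2020, 9, 1),
    "S2": date(2020, 11, 1),
}

def season_day(season: str, event_date: date) -> int:
    """1-based day of the season on which the event happened."""
    return (event_date - SEASON_OPEN[season]).days + 1

if __name__ == "__main__":
    print(season_day("S2", date(2020, 11, 19)))  # 19
```

Because the computation lives in the virtual property rather than the log, fixing or extending the dimension table retroactively reshapes every past event's value too.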
Alarm management is mainly real-time monitoring of commonly used indicators: you set an early-warning threshold, and if a metric goes abnormal, a message is sent in time.
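The threshold check at the heart of such an alarm is simple; this sketch assumes a metric name and a numeric range, with the actual delivery (mail, IM, etc.) left out:

```python
# Toy alert check: compare a live metric against a preset range and
# produce a notification message when it goes out of bounds.
from typing import Optional

def check_alert(metric: str, value: float,
                low: float, high: float) -> Optional[str]:
    """Return an alert message if value is outside [low, high], else None."""
    if value < low or value > high:
        return f"ALERT: {metric}={value} outside [{low}, {high}]"
    return None

if __name__ == "__main__":
    # Hypothetical thresholds: expect DAU between 95,000 and 200,000.
    print(check_alert("dau", 90_000, 95_000, 200_000))
```

In practice the value of the feature is less the check itself than that it runs continuously against real-time data, so anomalies surface within minutes rather than at T+1.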
03. The changes the TA system brought to the Ulala project
The third part covers what we have done since connecting to the TA system and the changes it has brought to the Ulala project, from three aspects.
The first is operations analysis. For example, different activities go live in different periods, and we want to see players' willingness to pay. On the ThinkingData platform we can group these activities on a dashboard and continuously observe their ARPU in a single interface. Once the dashboard is built, when a new activity launches later we do not need to start over to get its ARPU: as soon as it goes live at the scheduled time it appears in our report, which is very helpful for continuously observing and comparing against the activities we have run before.
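The ARPU comparison behind such a dashboard is revenue divided by distinct active users, per activity. A minimal sketch with invented activity names and amounts:

```python
# Toy ARPU comparison across activities: revenue / distinct users per activity.
def arpu_by_activity(rows: list) -> dict:
    """rows: (activity, user_id, revenue) tuples; revenue may be 0
    for active-but-non-paying users."""
    revenue, users = {}, {}
    for activity, user, amount in rows:
        revenue[activity] = revenue.get(activity, 0) + amount
        users.setdefault(activity, set()).add(user)  # distinct active users
    return {a: revenue[a] / len(users[a]) for a in revenue}

if __name__ == "__main__":
    rows = [("springfest", "u1", 30), ("springfest", "u2", 0),
            ("halloween", "u1", 12)]
    print(arpu_by_activity(rows))  # {'springfest': 15.0, 'halloween': 12.0}
```

Keeping every activity in one mapping is what makes side-by-side comparison free: a newly launched activity is just one more key.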
The second point is continuous observation of indicator changes. For example, judging the effectiveness of an ad creative requires multiple angles: not only comparing against ourselves but also against similar projects to see whether there is improvement. For CPI we set an early-warning threshold; if a creative cannot even reach the target CPI, something must be wrong, and we have to sum up the lessons and optimize it going forward.
Finally, business analysis, such as gameplay analysis, with pets as the example again. We previously launched three super pets: tiger, wolf, and dragon. For any pet we launch, once a player draws it and evolves it, that player's subsequent activity and retention improve markedly. Before we adopted the TA system, planning mainly raised the requirements and we did the analysis; after adopting TA, this pet-evolution launch was analyzed by the planners themselves. Since they built the whole feature, they understand the module more deeply; when planners do the analysis they bring a deeper grasp of the business logic and of the design goals they want to achieve, so their analysis can be more convincing, while our data analysts have more energy to verify it and analyze it from our own perspective. We run analyses for every version and look at each activity from different angles, then see whether the conclusions all agree or are completely opposed. The launch of the TA system has had a positive effect on this kind of collision of ideas.
04. How to make better use of the TA system
Finally, my expectations for the continued cooperation between Ulala and the TA system; I hope it can go even further. First, a good tool liberates most of our productivity: previously, handling planning and operations requirements took up so much working time that we had no energy left to analyze the project's other features from our own perspective. Using the TA system also raises data-analysis capability, which genuinely demands deep business understanding and rigorous logical reasoning. Beyond the students who specialize in planning, data analysis, or operations, those who understand the business better or reason more rigorously, including program developers, should also have a say. I also hope to raise data awareness across the whole company: not just senior executives or product people, but programmers, UI, and art as well. With a good tool like the TA system we can cultivate everyone's data awareness and see, reflected in the data, what our work actually does for users: how much effort did they spend on it, did they pay and how much, how enthusiastic were they? Only then can we feel a sense of accomplishment in what we build, which in turn pushes us to improve our development. And when you can truly see an online bug in the data, say daily actives are 1,000,000 and a bug costs 100,000 of them, aren't you more motivated to take care of your own business module and feed something positive back to the whole project?
Finally, on the cooperation with ThinkingData: together we have been discovering the value of data. Data is the carrier of information and the players' feedback on our game products. We hope to stay user-first, improve the effectiveness of our creative work, keep enriching our data-analysis ideas, and fully tap the value of the data. I look forward to more exchanges with the ThinkingData team and to continuing to improve efficiency. Thank you.
Source: Game Tea House