This article explains how to understand the integration and collaboration of the core components of a low-code development platform. The explanation is simple and clear and easy to follow, so let's study it together.
Overview of low-code platform
The Baidu encyclopedia entry gives a basic definition of the low-code development platform, as follows:
A low-code development platform (LCDP) is a development platform that can quickly generate applications with no code (zero code) or only a small amount of code. Through visual application development (that is, a visual programming language), developers with different levels of experience can use graphical user interfaces, drag-and-drop components, and model-driven logic to create web and mobile applications. After the business logic and functions have been built, the low-code platform can deliver the application and update it with one click.
Extracting the key elements of this definition, the core points are:
No coding, or only a small amount of coding, is required
Applications are built visually and configurably, through assembly or configuration
Besides these two points there is a third aspect, process support: the entire launch or delivery process of the developed application should be simple and automated enough, including configuration that takes effect immediately and one-click delivery as mentioned above.
There are currently many vendors offering low-code development platforms. Although their solutions and overall architectures differ, the essence is the same: everything can be configured and everything can be modeled.
Imagine the process of developing a simple function: designing the database tables, designing the front-end interface, writing the logic-layer code and interfaces that implement the business rules, connecting a process engine to implement the workflow, configuring function and data permissions, and so on.
Any low-code development platform therefore needs to abstract and model around this core, identifying the common parts that have nothing to do with the business and distilling them into reusable technical capabilities, as we often say.
Things that are completely standard are directly standardized.
Things that are not standard but belong to the same scenario are handled by abstracting the differences into parameterized configuration.
As you can see, the construction of an LCDP is essentially based on the ideas above. So what are the core elements of an LCDP? I have drawn a diagram to illustrate them.
The core of an LCDP includes several key parts: data modeling, form modeling, process modeling, permission modeling, report modeling, and rule modeling. Through these modeling components, and the collaboration between them, a complete business system and its functions are built.
Driven by object modeling
A good low-code development platform should be object driven, which is similar to the MDA (model-driven architecture) ideas discussed long ago. Object modeling comes first, and here the object leans toward business objects or domain objects: a single object can correspond to multiple database tables, which may be hierarchical tables or other related table structures.
After object modeling is complete, domain service capability interfaces can be exposed toward the front, and persistence to the database handled toward the back. Exposing domain service interfaces makes it easier to fully separate and decouple front-end and back-end logic, and also makes it easier to handle transaction control inside the domain service implementation.
With object modeling, an object service layer is effectively added between the back-end model and the front-end interface, and the object services are provided through API interfaces, which is consistent with the layered application construction of SOA.
After object modeling is complete, the object itself can generate database tables downward and publish API interface services upward. Form modeling is then no longer associated directly with database tables but instead references the corresponding API interface services. In that case, the API interface services themselves should also be defined and designed with a strong service contract.
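As a rough illustration, a minimal sketch of such a service contract in TypeScript might look like the following; the Supplier fields and service names are hypothetical and not taken from any specific platform.

    // Hypothetical domain object produced by object modeling.
    // The form designer binds to this contract, not to the underlying tables.
    interface Supplier {
      id: string;
      name: string;
      level?: 'A' | 'B' | 'C';            // maintained later, during approval
      qualifications?: string[];          // attachment ids, maintained during approval
      bankAccount?: { bank: string; accountNo: string };
    }

    // Strong service contract published upward by the object layer.
    interface SupplierService {
      create(data: Omit<Supplier, 'id'>): Promise<Supplier>;
      update(id: string, patch: Partial<Supplier>): Promise<Supplier>;
      getById(id: string): Promise<Supplier | null>;
      query(filter: { name?: string; level?: string }): Promise<Supplier[]>;
    }

A form bound to such a contract stays unchanged even if the underlying tables are restructured, which is the point of decoupling form modeling from the database schema.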
Once there is an independent interface layer, composing or assembling upper-layer functions becomes much easier and more convenient; we can provide a tool similar to traditional BPEL processes or service orchestration to visually assemble and orchestrate the interfaces for the upper-layer business.
Form and process engine integration
Many low-code platforms on the market today have evolved from traditional BPM software or workflow engine platforms, so the process engine itself is a very important technical service capability at the bottom of a low-code platform.
Because the process engine is tightly bound to the organization model for fine-grained data access control and dynamic access control during the process, we will not describe the specific integration points between the organization engine and the process engine here, but will focus on the integration between forms and the process engine.
Connecting form design to a process
Start with the simplest scenario, in which form design and process design are separate. A process template is designed in advance and generates a process template ID, and then the form is designed. After the form design is complete, a process template ID can be chosen and attached to it.
In this scenario, it is easy for the form to start the process when it is submitted, roughly:
    formId = GenerateFormID();
    Form.Save();
    StartWorkflow(formId, workflowTemplateID, userId);
Process reviewer fills in extended information
If, in the attached process, the handlers at every node simply review whether to approve and fill in their comments, the approach above is entirely sufficient. In many cases, however, extended information needs to be filled in during review.
For example, when a supplier-creation application form is submitted and routed to the purchasing manager for review, the purchasing manager needs to determine the supplier's level and upload the supplier's qualification documents. Supplier level and supplier qualification are basic data items of the supplier object model, but they are not maintained when the document is submitted; they are maintained during review.
In this scenario we can no longer simply establish a plain association and mapping between the form and the process template; instead, form data items must be mapped to process nodes. Mapping individual data items is too fine-grained, so data groups can be introduced at form design time, and the mapping is then defined between data groups and process activity nodes.
As shown in the figure above, the form data is grouped and the mapping and authorization relationships between process activity nodes and form groups are established. The overall form submission and approval process then becomes:
When the applicant submits the form, only the basic information group is visible
When approver 1 enters the approval step, the extended group becomes visible and its data can be maintained before the approval result is submitted
Approver 2 can see all three groups, but can only maintain data in extended group 2 before submitting the review result
In process monitoring, the extended group information can be seen for nodes that have already been executed
In other words, this achieves not only dynamic, process-based access control for participants, but also lets process participants maintain and fill in form information while reviewing. After maintenance, the extended group information is still stored in the underlying object data tables rather than in the process instance tables, which are only responsible for state transitions and for calculating the participants of the next stage.
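A minimal sketch of such a node-to-group mapping, expressed as plain configuration data, might look like the following; the node and group names are made up for illustration.

    // Hypothetical mapping between process activity nodes and form data groups.
    // 'read' groups are visible at that node; 'write' groups can be maintained there.
    interface NodeGroupMapping {
      node: string;
      read: string[];
      write: string[];
    }

    const supplierApprovalMapping: NodeGroupMapping[] = [
      { node: 'Submit',    read: ['basic'],                           write: ['basic'] },
      { node: 'Approver1', read: ['basic', 'extended1'],              write: ['extended1'] },
      { node: 'Approver2', read: ['basic', 'extended1', 'extended2'], write: ['extended2'] },
    ];

    // Resolve what may be seen and edited at a given node.
    function permissionsAt(node: string): NodeGroupMapping {
      return supplierApprovalMapping.find(m => m.node === node) ?? { node, read: [], write: [] };
    }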
Dynamic permissions + data access control
Dynamic permissions simply mean that you have permission to view the form data while you are handling the process node, but the permission is withdrawn once your handling is complete. Or you may only see the forms you review and handle, not all supplier form data. Data permissions determine which data items of the whole data object you can see; the group-based authorization described earlier is also a common way to control data permissions.
Decoupling form saving from process startup and flow
When the process engine is implemented as an independent technical service, process startup, process flow, and so on are all completed by calling API interface services, which creates a distributed transaction scenario.
It is still good practice to decouple form saving from the process API services asynchronously: requests that trigger the process flow are first written to message middleware, and the message queue data is then consumed asynchronously to start the process or advance the flow node.
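A minimal sketch of this decoupling, assuming a generic message-queue client whose topic name and API are invented for the example:

    // Hypothetical message-queue based decoupling of form save and process start.
    interface MessageQueue {
      publish(topic: string, payload: unknown): Promise<void>;
      subscribe(topic: string, handler: (payload: any) => Promise<void>): void;
    }

    async function submitForm(mq: MessageQueue, formId: string, templateId: string, userId: string) {
      // 1. Save the form in the local transaction; the business data is the source of truth.
      await saveFormToDatabase(formId);
      // 2. Publish a "start workflow" event instead of calling the process engine directly.
      await mq.publish('workflow.start', { formId, templateId, userId });
    }

    // Elsewhere, a subscriber consumes the event and calls the process engine API.
    function registerWorkflowStarter(
      mq: MessageQueue,
      engine: { startWorkflow: (formId: string, templateId: string, userId: string) => Promise<void> },
    ) {
      mq.subscribe('workflow.start', async ({ formId, templateId, userId }) => {
        await engine.startWorkflow(formId, templateId, userId);
      });
    }

    // Stub: persisting the form is assumed to be handled by the object service layer.
    async function saveFormToDatabase(formId: string): Promise<void> { /* persist form */ }

If the process engine is temporarily unavailable, the saved form is not lost; the queued message can simply be retried, which is exactly what the asynchronous decoupling buys.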
Implementation of process rules
The handling of the whole process also involves processing and implementing rules, which can be understood as two kinds.
One kind is simple judgment rules, for example a reimbursement of more than 10,000 must be routed to the general manager for review. The other kind is complex rule judgments, for example a detailed budget integrity check must be performed, with pass and fail routed to different branches.
Combined with a traditional process engine implementation, simple rules can be handled by passing parameter variables, while for complex rules we can consider calling an external API interface service directly to perform the check.
Here, only the data item information that is needed for process decisions is passed as parameters to process execution, which by itself reduces the resource load. For passing the Param information there are two approaches: cache it when the process instance is started, or pass it in again each time a process activity node executes. My personal advice is to cache it when the process instance is started.
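A minimal sketch of the two kinds of rules evaluated at a branching point, assuming the decision parameters were cached when the process instance started; all names here are illustrative.

    // Hypothetical decision parameters cached at process-instance start.
    interface ProcessParams { amount: number; costCenter: string; }
    const paramCache = new Map<string, ProcessParams>();   // processInstanceId -> params

    // Simple rule: decided purely from the cached parameter variables.
    function nextBranchSimple(instanceId: string): 'GeneralManager' | 'DeptManager' {
      const p = paramCache.get(instanceId);
      return p !== undefined && p.amount > 10000 ? 'GeneralManager' : 'DeptManager';
    }

    // Complex rule: delegated to an external API interface service, e.g. a budget check.
    async function nextBranchComplex(
      instanceId: string,
      budgetCheck: (costCenter: string, amount: number) => Promise<boolean>,
    ): Promise<'Pass' | 'Reject'> {
      const p = paramCache.get(instanceId);
      if (!p) return 'Reject';
      return (await budgetCheck(p.costCenter, p.amount)) ? 'Pass' : 'Reject';
    }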
Integration of form and organizational permission model
As mentioned earlier, object modeling should be separated from form modeling: after an object model is completed, several different form models can actually be built on top of it. Take the supplier as an example again.
Even if the supplier is not connected to any process, it can be divided into different form functions, such as complete supplier information maintenance, supplier bank information maintenance, supplier fuzzy query, supplier qualification query, and viewing a specific supplier, yet all of these form functions correspond to the same object model. After a form is designed it becomes a form function: it can be linked to a specific function menu, or bound to a specific button or event. Put simply, the form itself may take input parameters.
For example, a supplier query form returns a list; clicking the detail link in the list should navigate to the view page for that specific supplier, so the supplier ID is an important input parameter.
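A small illustration of a form function that takes an entry parameter; the metadata shape here is invented purely for the example.

    // Hypothetical form-function metadata: a detail form bound to an entry parameter.
    interface FormFunction {
      formId: string;
      objectModel: string;          // the object model the form is built on
      entryParams: string[];        // values the caller must supply
      loadService: string;          // the object API called with those params
    }

    const supplierDetailForm: FormFunction = {
      formId: 'supplier-detail',
      objectModel: 'Supplier',
      entryParams: ['supplierId'],
      loadService: 'SupplierService.getById',
    };

    // A list form opens the detail form by passing the clicked row's id.
    function openDetail(form: FormFunction, supplierId: string): string {
      return `/forms/${form.formId}?${form.entryParams[0]}=${supplierId}`;
    }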
We can see that the overall idea of access control is still based on role + resource access authorization.
A form is attached to a menu resource, and menus can be authorized to user groups or roles. Similarly, the data items or data groups in the form design can also be defined as resource objects that need fine-grained control; once defined, these resource objects can be authorized to specific user groups or roles. Buttons in the form can be handled with the same idea.
That is, after form modeling is complete, resource objects need to be abstracted from the finished form; a resource can be an individual data item, a data group, or a button operation. The resource objects are then bound to roles or users.
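A minimal sketch of resources abstracted from a form and their role authorizations; the resource and role names are made up for illustration.

    // Hypothetical fine-grained resources abstracted from a form.
    type ResourceKind = 'dataItem' | 'dataGroup' | 'button';
    interface FormResource { id: string; kind: ResourceKind; defaultVisible: boolean; }

    const supplierFormResources: FormResource[] = [
      { id: 'supplier.amount',      kind: 'dataItem',  defaultVisible: false },  // hidden unless granted
      { id: 'supplier.extended2',   kind: 'dataGroup', defaultVisible: true  },
      { id: 'supplier.btn.approve', kind: 'button',    defaultVisible: false },
    ];

    // Role-to-resource grants.
    const grants: Record<string, string[]> = {
      PurchasingDirector: ['supplier.amount', 'supplier.btn.approve'],
      Buyer:              ['supplier.extended2'],
    };

    function canSee(role: string, resource: FormResource): boolean {
      return resource.defaultVisible || (grants[role] ?? []).includes(resource.id);
    }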
Default settings for data item permissions
Data item permission configuration should support both default-visible and default-invisible. For example, if the purchase order amount should be visible only to the purchasing director, the default for this data item is completely invisible, and the amount is then defined as a resource and authorized to the purchasing director role.
Static and dynamic permissions overlap
What happens when static and dynamic permissions overlap? Generally speaking, dynamic permissions should prevail. For example, even if the static permissions grant nothing, if the dynamic calculation grants the right to operate or maintain the data, then while the dynamic process is being executed the user has dynamic permission over the relevant information. The permission disappears once the process step has been handled.
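A tiny sketch of how the effective permission could be resolved, where the dynamic grant applies only while the user's process node is active; the types are invented for illustration.

    // Hypothetical resolution of overlapping static and dynamic permissions.
    type Access = 'none' | 'read' | 'write';

    function effectiveAccess(staticAccess: Access, dynamicAccess: Access, nodeActiveForUser: boolean): Access {
      // While the user is handling an active process node, the dynamic grant prevails.
      if (nodeActiveForUser && dynamicAccess !== 'none') return dynamicAccess;
      // Otherwise fall back to the static authorization; the dynamic grant has lapsed.
      return staticAccess;
    }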
Forms and rules engine
For a low-code development platform, I personally do not recommend introducing a heavyweight rule engine. Understand that if the real business rules are very complex, implementing them with a rule engine still requires a lot of script code, and that script code is itself harder to maintain.
In all the years since rule engines appeared, truly mature application scenarios in enterprise IT have been rare.
As for rules themselves, we can look at them in two categories.
One category is rules that can be calculated and verified purely from the current front-end form data; these are generally referential-integrity-style rules. For example, when I order goods on an e-commerce platform and select 10 items, different discounts apply but some of them cannot be combined; the front end can complete the rule calculation and give the best discount and total cost. For such rules the form designer needs support, and the best way is scripting: you write simple rule-definition scripts yourself and handle them there.
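As a rough sketch, a front-end rule script of this kind might look like the following; the shape of the data and the discount logic are invented for the example.

    // Hypothetical front-end rule script: compute the best total purely on the client,
    // given that the candidate discounts cannot be combined with each other.
    interface OrderLine { price: number; qty: number; }

    function orderTotal(lines: OrderLine[], candidateDiscounts: number[]): number {
      const subtotal = lines.reduce((sum, line) => sum + line.price * line.qty, 0);
      // Discounts are mutually exclusive, so apply only the best single one.
      const best = candidateDiscounts.length > 0 ? Math.max(...candidateDiscounts) : 0;
      return subtotal * (1 - best);
    }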
There is also a class of rules that needs back-end data to complete the calculation. For example, when an accounting document application is submitted, the background needs to check whether the current budget is sufficient, and only if it is can the document be submitted.
This second type of rule is actually the scenario a rule engine may be involved in.
In this second type of scenario, the overall flow can be understood as:
Define rules in the rule model; each rule accepts input and produces output
The form passes the key Param parameters to the rule engine
The rule fetches the data it needs from the back-end database based on the Param parameters
The calculation is performed according to the rules defined in advance
The result of the rule calculation is returned to the form front end
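A minimal sketch of a back-end rule of this kind, using a budget check as the example; the service shape and names are assumptions, not a specific platform's API.

    // Hypothetical back-end rule: the form posts key params, the rule loads supporting
    // data from the database, evaluates, and returns the result to the form front end.
    interface RuleResult { passed: boolean; message?: string; }

    async function checkBudgetRule(
      params: { costCenter: string; amount: number },
      db: { remainingBudget: (costCenter: string) => Promise<number> },
    ): Promise<RuleResult> {
      const remaining = await db.remainingBudget(params.costCenter);   // back-end data lookup
      return remaining >= params.amount
        ? { passed: true }
        : { passed: false, message: `Budget insufficient: ${remaining} remaining` };
    }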
If the rules are complex, you will find that they cannot be implemented through simple configuration and still require a lot of rule script code. In that case it is more advisable to implement your own custom API interface.
During form design, the custom API interface can then be called at key event points.
Today's rapid development platforms are called low-code development precisely because you should not expect every complexity, especially complex rules, to be handled by configuration. The best approach is still to implement complex rules in your own code behind an interface, and then be able to call those self-written API interface methods during form modeling and process modeling.
An event is a standard concept in software development; it can be triggered by clicking a button, selecting from a drop-down, shifting focus, and so on. After the event is triggered, the form's own processing logic is invoked, with no rules involved.
Take the save button: after the event is triggered, the form save operation is called to persist the data. In practice, however, you may need to perform business rule and logic processing before saving, and trigger other related actions after saving.
    Form.SaveBefore();
    Form.Save();
    Form.SaveAfter();
You may even need to call multiple API interfaces to perform several checks before saving.
Another key point arises here: the platform should support not only calling external API interfaces before and after an event, but also simple orchestration of those API interfaces.
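A minimal sketch of such before/after hooks around a form save, with the hook shape invented for the example:

    // Hypothetical event hooks around form save: several checks are orchestrated
    // before the save, and follow-up actions run after it.
    type Hook = (formData: Record<string, unknown>) => Promise<void>;

    async function saveWithHooks(
      formData: Record<string, unknown>,
      before: Hook[],           // e.g. budget check, duplicate check
      save: Hook,               // the form's own save operation
      after: Hook[],            // e.g. notifications, audit logging
    ): Promise<void> {
      for (const check of before) await check(formData);
      await save(formData);
      for (const action of after) await action(formData);
    }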
In fact, for a complete low-code development platform this open orchestration capability is necessary in the face of varied and complex business scenarios. Such orchestration is essentially the same as service composition or process orchestration in SOA.
Service composition and orchestration
Take the purchase of goods on an e-commerce platform as an example again: when an order is submitted, three actions need to be handled at the same time. The save operation of the order object is called, the inventory deduction of the inventory object is called, and the delivery order of the delivery object is created automatically.
In this scenario you can see that it is difficult to achieve this without service composition and orchestration capabilities.
The interface capabilities exposed by object modeling are further visually assembled and orchestrated into composite service capabilities, which are then exposed to form design.
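A minimal sketch of such a composite service assembled from the object-level interfaces; the three service shapes are invented for the example.

    // Hypothetical composite service assembled from interfaces exposed by object modeling;
    // the form design calls this single composite capability on order submission.
    interface OrderService     { save(order: { id: string; items: { sku: string; qty: number }[] }): Promise<void>; }
    interface InventoryService { deduct(sku: string, qty: number): Promise<void>; }
    interface DeliveryService  { create(orderId: string): Promise<void>; }

    async function submitOrderComposite(
      order: { id: string; items: { sku: string; qty: number }[] },
      orders: OrderService,
      inventory: InventoryService,
      delivery: DeliveryService,
    ): Promise<void> {
      await orders.save(order);                                                    // 1. save the order object
      for (const line of order.items) await inventory.deduct(line.sku, line.qty);  // 2. deduct stock
      await delivery.create(order.id);                                             // 3. create the delivery order
    }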
A good low-code development platform should draw on the ideas of SOA layered architecture and domain modeling: API interface service capabilities are exposed once the object modeling and design are complete, and the foreground form design interacts with that API interface service layer. This design makes it convenient to extend service orchestration capabilities later.
Even if visual service orchestration is not available in the early stage, we can still develop custom API interface service capabilities and plug them in.
Thank you for reading. The above is the content of "how to understand the integration and collaboration of core components of a low-code development platform". After studying this article, I believe you have a deeper understanding of the topic.