The "Reinventing the Wheel" ORM Framework Series ~ The ORM Framework in My Mind

ORM Conceptual Analysis

First, let's sort out the concept of ORM. ORM stands for Object Relational Mapping. Object refers to an object in an object-oriented language; this article uses C#, so it is a .NET object. Relational Mapping refers to the mapping between .NET objects and the tables of a relational database system.

Why map .NET objects to database tables?

The answer is obvious: the structure of a database table is very similar to that of an object. If a mapping is established between the two, we can manipulate database tables in an object-oriented language as easily as ordinary objects, which is good news for programmers. So who establishes this relationship? It cannot appear out of thin air; there must be a "matchmaker". That "matchmaker" is called an ORM framework, or a database persistence framework.

In the previous article, "Reinventing the Wheel" ORM Framework Series (1) ~ On the CRUD I Meet in Actual Business, I talked about the CRUD scenarios I have encountered and analyzed the advantages and disadvantages of native SQL, Lambda-to-SQL, stored procedures, and so on. In the process, I gradually formed my own understanding of ORM frameworks and sketched out the blueprint of an ideal ORM framework in my mind.

Before drawing the blueprint, let me give my ORM framework a name: CoffeeSQL. The name expresses the hope that using this framework will save me enough time at work to enjoy a cup of coffee.

1. Entity Mapping

To map C# objects to database tables, the configuration information for that mapping has to be recorded somewhere. I prefer the most straightforward way: marking attributes directly on the members of the class.

The basic usage is as follows:

/// <summary>
/// Entity Class
/// </summary>
[Table("T_Students")]
public class Student : EntityBase
{
    [PrimaryKey]
    [Column]
    public string Id { get; set; }

    [Column("Name")]
    public string StudentName { get; set; }

    [Column]
    public int Age { get; set; }
}

You can see that three attributes are used in this entity class: TableAttribute, PrimaryKeyAttribute, and ColumnAttribute. Their meanings should be self-explanatory.
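For readers curious about what lies behind these markers: attributes like these are just plain classes derived from System.Attribute. The following is a minimal sketch of how such TableAttribute, ColumnAttribute, and PrimaryKeyAttribute classes could be declared; it illustrates the idea only and is not CoffeeSQL's actual source.

using System;

//Maps a class to a table; falls back to the class name when no table name is given.
[AttributeUsage(AttributeTargets.Class)]
public class TableAttribute : Attribute
{
    public string TableName { get; }
    public TableAttribute(string tableName = null) => TableName = tableName;
}

//Maps a property to a column; falls back to the property name when no column name is given.
[AttributeUsage(AttributeTargets.Property)]
public class ColumnAttribute : Attribute
{
    public string ColumnName { get; }
    public ColumnAttribute(string columnName = null) => ColumnName = columnName;
}

//Marks the primary key property.
[AttributeUsage(AttributeTargets.Property)]
public class PrimaryKeyAttribute : Attribute
{
}

At startup, an ORM can reflect over these attributes once, cache the table and column names, and use them when generating SQL.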

2. Lambda operations: strongly typed CRUD

For the most basic CRUD functionality, SQL operations on a single table are very frequent. I recommend using lambda expressions for these: first because they are convenient, and second because the SQL produced by a single-table Lambda-to-SQL conversion is the simplest possible, so there is no need to worry about its performance.

The specific code operations are as follows:

1) Insert

//Add
dbContext.Add<Student>(new Student { Id = newId, StudentName = "WangTwo", Age = 20 });

2) Delete

//Delete
dbContext.Delete<Student>(s => s.Id.Equals(newId));

3) Update

//Update
dbContext.Update<Student>(s => new { s.StudentName }, new Student { StudentName = name })  //fields to update
         .Where(s => s.Id.Equals(newId))                                                   //where condition
         .Done();

4) Query

//Select
dbContext.Queryable<Student>()
         .Select(s => new { s.StudentName, s.Id })              //field projection
         .Where(s => s.Age > 10 && s.StudentName.Equals(name))  //where condition
         .Paging(1, 10)                                         //paging query
         .ToList();
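To give a feel for why single-table Lambda-to-SQL stays cheap, here is a toy translator that walks a simple predicate's expression tree and emits a WHERE clause. It is written purely for illustration and handles only comparisons, AND/OR, properties, and constants; it is not the converter CoffeeSQL actually uses.

using System;
using System.Linq.Expressions;

public static class SimpleWhereTranslator
{
    //Translates a simple predicate such as s => s.Age > 10 into SQL text.
    public static string Translate<T>(Expression<Func<T, bool>> predicate)
        => Visit(predicate.Body);

    private static string Visit(Expression node)
    {
        switch (node)
        {
            case BinaryExpression b:
                return $"({Visit(b.Left)} {Op(b.NodeType)} {Visit(b.Right)})";
            case MemberExpression m when m.Expression is ParameterExpression:
                return m.Member.Name;                      //column name = property name (simplified)
            case ConstantExpression c when c.Value is string s:
                return $"'{s}'";                           //no escaping: illustration only
            case ConstantExpression c:
                return c.Value?.ToString() ?? "NULL";
            default:
                throw new NotSupportedException(node.NodeType.ToString());
        }
    }

    private static string Op(ExpressionType t) => t switch
    {
        ExpressionType.Equal => "=",
        ExpressionType.GreaterThan => ">",
        ExpressionType.LessThan => "<",
        ExpressionType.AndAlso => "AND",
        ExpressionType.OrElse => "OR",
        _ => throw new NotSupportedException(t.ToString())
    };
}

Called as SimpleWhereTranslator.Translate<Student>(s => s.Age > 10), this returns (Age > 10). A real converter additionally handles method calls, captured variables, and parameterization, but for a single table the expression tree stays about this small, which is why the generated SQL is easy to trust.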

3. Native SQL operations: weakly typed, with an indexer for non-entity fields

The lambda expressions described above only work for single-table queries. For multi-table joins or more complex queries, I prefer native SQL. Native SQL queries, of course, can also map their results to objects, similar to how Dapper is used.

The specific operations are as follows:

//Native SQL query usage
string complexSQL = "select xxx from t_xxx A,t_yyy B where xx = and yy = ";
object[] sqlParams = new object[2] { DateTime.Now, 2 };
var resList = dbContext.Queryable(complexSQL, sqlParams).ToList<Student>();

A brief explanation of the code above: the placeholders in complexSQL stand in for the parameters of a parameterized query. In native SQL query usage, you can fill each placeholder with any basic C# variable and still get the effect of a parameterized SQL query, which is very convenient to use.
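Under the hood, this kind of positional filling usually boils down to building one ADO.NET parameter per argument. The sketch below shows the general idea with plain ADO.NET; the @p0/@p1 naming scheme is my own assumption for illustration, not necessarily what CoffeeSQL generates.

using System;
using System.Data.Common;

public static class ParameterBinder
{
    //Attaches one DbParameter per positional argument: @p0, @p1, ... (naming scheme assumed).
    public static void Bind(DbCommand command, object[] args)
    {
        for (int i = 0; i < args.Length; i++)
        {
            DbParameter p = command.CreateParameter();
            p.ParameterName = "@p" + i;
            p.Value = args[i] ?? DBNull.Value;
            command.Parameters.Add(p);
        }
    }
}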

Of course, you may be wondering: what if the query result contains values for columns that are not properties of the Student object?

The answer is that you can read such non-entity column values through an indexer. For example, to read the xxx column, you can do this:

Student resList1 = resList[0];
var segmentValue = (string)resList1["xxx"]; //any column in the query result can be read through the indexer; remember to cast to the column's actual type

So Easy!
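How can an entity expose columns that are not mapped properties? One simple way is for the entity base class to keep a dictionary of "extra" columns filled in during result mapping and expose it through an indexer. The sketch below illustrates that idea with a hypothetical base class; it is not CoffeeSQL's actual EntityBase.

using System.Collections.Generic;

//Hypothetical entity base: unmapped columns from a query land in this dictionary and are read back via the indexer.
public abstract class IndexedEntityBase
{
    private readonly Dictionary<string, object> _extraColumns = new Dictionary<string, object>();

    public object this[string columnName]
    {
        get => _extraColumns.TryGetValue(columnName, out var value) ? value : null;
        set => _extraColumns[columnName] = value;
    }
}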

4. For more complex SQL logic, use stored procedures

The business logic of a manufacturing enterprise is inseparable from its form workflows, and querying and writing form data already involves fairly complex SQL.

But that is not even the worst of it. With big data concepts so prevalent, shop-floor managers want to ride the trend too, so they display Kanban boards of various statistics on the workshop's large screens, which involves query logic across a large number of tables. I once developed a Kanban feature and counted: a single SQL statement came close to 200 lines of code, no exaggeration. Complex queries like this may also require executing different SQL depending on different conditions. I certainly will not stitch native SQL together in the application language, because it is easy to get confused. This is when the last resort comes out: the stored procedure.

You can use stored procedures like this in CoffeeSQL:

using (ShowYouDbContext dbContext = new ShowYouDbContext())
{
    string strsp = "xxx_storeprocedureName";

    OracleParameter[] paras = new OracleParameter[]
    {
        new OracleParameter("V_DATE", OracleDbType.Varchar2, 50),
        new OracleParameter("V_SITE", OracleDbType.Varchar2, 50),
        new OracleParameter("V_SCREEN", OracleDbType.Varchar2, 50),
        new OracleParameter("V_SHIFT", OracleDbType.Varchar2, 50),
        new OracleParameter("V_CURSOR", OracleDbType.RefCursor)
    };

    paras[0].Value = info.date;
    paras[1].Value = info.site;
    paras[2].Value = info.screenKey;
    paras[3].Value = info.shift;
    paras[4].Direction = ParameterDirection.Output;

    DataSet ds = dbContext.StoredProcedureQueryable(strsp, paras).ToDataSet();

    return ds;
}

The usage here adds no extra wrapping; what is worth mentioning is that the query results can still be converted to objects. That, of course, is a defining feature found throughout the ORM framework.
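The "results can be converted to objects" part is essentially reflection over the returned rows. A minimal sketch of mapping a DataTable into a list of entities by matching column names to writable properties might look like this; it is an illustration only, not CoffeeSQL's mapper, and it ignores details such as nullable types.

using System;
using System.Collections.Generic;
using System.Data;
using System.Reflection;

public static class DataTableMapper
{
    //Maps each DataRow to a new T by matching column names to writable property names.
    public static List<T> ToList<T>(DataTable table) where T : new()
    {
        var result = new List<T>();
        PropertyInfo[] props = typeof(T).GetProperties(BindingFlags.Public | BindingFlags.Instance);

        foreach (DataRow row in table.Rows)
        {
            var item = new T();
            foreach (PropertyInfo prop in props)
            {
                if (!prop.CanWrite || !table.Columns.Contains(prop.Name)) continue;

                object value = row[prop.Name];
                if (value != DBNull.Value)
                    prop.SetValue(item, Convert.ChangeType(value, prop.PropertyType));
            }
            result.Add(item);
        }
        return result;
    }
}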

5. Database connection management, one master and many slaves

Most databases today are not deployed as a single instance. The most popular deployment is "one master, many slaves": one database serves as the write database (the master), while the others serve as read databases (the slaves) that synchronize data from the master. This enables read-write separation, reduces the load on the master, and greatly improves database access performance. Naturally, the ORM framework we are discussing supports database operations under this "one master, many slaves" deployment.

The specific configurations for "one master, many slaves" database connections are as follows:

public class ShowYouDbContext : OracleDbContext<ShowYouDbContext>
{
    private static string writeConnStr = "xxxxxxx_1";
    private static string[] readConnStrs = new string[] {
        "xxxxxxxxx_2",
        "xxxxxxxxx_3",
        "xxxxxxxxx_4",
        "xxxxxxxxx_5",
        "xxxxxxxxx_6"
    };

    public ShowYouDbContext() : base(writeConnStr, readConnStrs)
    {
    }
}

When constructing the DbContext object, pass the connection strings as constructor arguments: the first is the connection string of the master (write) database, and the rest are the connection strings of the slave (read) databases.
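Inside the context, read-write separation usually comes down to one rule: writes always use the master connection string, while reads pick one of the slaves (round-robin here). The following is a minimal sketch of that selection logic with names of my own choosing, not CoffeeSQL's internals.

using System.Threading;

//Hypothetical connection selector: writes go to the master, reads rotate through the slaves.
public class ConnectionSelector
{
    private readonly string _writeConnStr;
    private readonly string[] _readConnStrs;
    private int _readIndex = -1;

    public ConnectionSelector(string writeConnStr, string[] readConnStrs)
    {
        _writeConnStr = writeConnStr;
        _readConnStrs = readConnStrs;
    }

    public string GetWriteConnection() => _writeConnStr;

    public string GetReadConnection()
    {
        //Fall back to the master when no slaves are configured.
        if (_readConnStrs == null || _readConnStrs.Length == 0) return _writeConnStr;

        int next = Interlocked.Increment(ref _readIndex);
        return _readConnStrs[next % _readConnStrs.Length];
    }
}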

6. Entity data validation as a line of defense for data format

The idea of entity data validation is easy to understand: every column in a data table has its own length, size, and other constraints, so every entity that corresponds to a data table should carry matching constraints on its properties. These act as a line of defense for data format, ensuring that the data persisted to the table conforms to its required format, much like goods going through a security check before entering a warehouse.

You can configure data validation rules for an entity class like this:

1 [Table("B_DICTIONARY")] 2 public class Dictionary : EntityCommon 3 { 4 [Length(1,50)] 5 [Column] 6 public string Name { get; set; } 7 [Column] 8 public string Value { get; set; } 9 [Column] 10 public string Type { get; set; } 11 }

The Name property in this entity class is marked with a LengthAttribute, specifying that the field's length must be between 1 and 50; an exception is thrown if it does not conform. Of course, there is more than one validation rule for entity data; more rules can be added later, or users can extend them to suit their own needs.
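As a rough idea of how such a check can work: the attribute stores the bounds, and before persisting, the framework reflects over the entity's marked properties and throws when a value falls outside the range. The following is an illustrative sketch, not CoffeeSQL's actual attribute or validator.

using System;
using System.Reflection;

[AttributeUsage(AttributeTargets.Property)]
public class LengthAttribute : Attribute
{
    public int Min { get; }
    public int Max { get; }
    public LengthAttribute(int min, int max) { Min = min; Max = max; }
}

public static class EntityValidator
{
    //Throws when any [Length]-marked string property is outside its configured range.
    public static void Validate(object entity)
    {
        foreach (PropertyInfo prop in entity.GetType().GetProperties())
        {
            var rule = prop.GetCustomAttribute<LengthAttribute>();
            if (rule == null) continue;

            int length = (prop.GetValue(entity) as string)?.Length ?? 0;
            if (length < rule.Min || length > rule.Max)
                throw new ArgumentException(
                    $"Property '{prop.Name}' length {length} is outside the allowed range [{rule.Min}, {rule.Max}].");
        }
    }
}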

7. Transactions: personally, I prefer to write them out explicitly rather than over-encapsulate

Transactions are an important concept in database systems and are characterized by ACID (Atomicity, Consistency, Isolation, Durability). I will not discuss the concept itself here; what I want to show is how to use transactions in CoffeeSQL:

try
{
    dbContext.DBTransaction.Begin();

    dbContext.Update<Machine_Match_Relation>(d => new { d.Del_Flag }, new Machine_Match_Relation { Del_Flag = 1 })
             .Where(d => d.Screen_Machine_Id.Equals(displayDeviceId))
             .Done();

    foreach (string bindDeviceId in bindDeviceIds)
    {
        dbContext.Add(new Machine_Match_Relation
        {
            Id = Utils.GetGuidStr(),
            Screen_Machine_Id = displayDeviceId,
            Machine_Id = bindDeviceId,
            Creater = updater
        });
    }

    dbContext.DBTransaction.Commit();
}
catch (Exception ex)
{
    dbContext.DBTransaction.Rollback();
    throw ex;
}

Some people will ask why I don't wrap this in a method that takes an Action delegate from the business code. You certainly could, but honestly it is not necessary; encapsulate it yourself if you prefer that style. A rough sketch of such a wrapper follows.
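For completeness, the wrapper mentioned above is only a few lines. Written against the DBTransaction calls shown above, with a method name of my own choosing, it could look roughly like this:

using System;

public static class TransactionExtensions
{
    //Runs the given work inside a transaction, committing on success and rolling back on failure.
    public static void RunInTransaction(this ShowYouDbContext dbContext, Action work)
    {
        try
        {
            dbContext.DBTransaction.Begin();
            work();
            dbContext.DBTransaction.Commit();
        }
        catch
        {
            dbContext.DBTransaction.Rollback();
            throw;
        }
    }
}

Usage would then be dbContext.RunInTransaction(() => { /* updates and inserts */ });. I still prefer seeing Begin/Commit/Rollback written out in the business code, which is exactly the point of this section.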

8. Extensible to multiple different databases

As a modern ORM, it of course has to support a variety of databases, such as Oracle, MySQL, SQL Server, and so on.

//Mysql
public class WQSDbContext : MysqlDbContext<WQSDbContext>
{
    public WQSDbContext() : base(mysqlConnStr)
    {
        this.OpenQueryCache = false;
        this.Log = context =>
        {
            Console.WriteLine($"sql:");
            Console.WriteLine($"time:");
        };
    }
}

//Oracle
public class WQSDbContext : OracleDbContext<WQSDbContext>
{
    public WQSDbContext() : base(oracleConnStr)
    {
        this.OpenQueryCache = false;
        this.Log = context =>
        {
            Console.WriteLine($"sql:");
            Console.WriteLine($"time:");
        };
    }
}

Currently, CoffeeSQL implements two database extensions: Oracle and MySQL. If you want to support more databases, you can try extending it yourself; these two are simply the databases I currently use.
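The usual way to make such extensions possible is to push everything database-specific behind a small abstraction: each database flavor supplies its own connection type and SQL dialect details, while the shared code works against the ADO.NET base classes. A simplified sketch of that shape follows; the class names are my own, the provider packages (MySql.Data, Oracle.ManagedDataAccess) are assumptions, and this is not CoffeeSQL's actual class hierarchy.

using System.Data.Common;
using MySql.Data.MySqlClient;            //assumed provider package
using Oracle.ManagedDataAccess.Client;   //assumed provider package

//Shared behavior lives in the base class; each database supplies only its provider-specific pieces.
public abstract class SimpleDbContext
{
    protected string ConnStr { get; }
    protected SimpleDbContext(string connStr) => ConnStr = connStr;

    public abstract DbConnection CreateConnection();
    public abstract string BuildPagingSql(string sql, int pageIndex, int pageSize);
}

public class SimpleMysqlDbContext : SimpleDbContext
{
    public SimpleMysqlDbContext(string connStr) : base(connStr) { }
    public override DbConnection CreateConnection() => new MySqlConnection(ConnStr);
    public override string BuildPagingSql(string sql, int pageIndex, int pageSize)
        => $"{sql} LIMIT {(pageIndex - 1) * pageSize}, {pageSize}";
}

public class SimpleOracleDbContext : SimpleDbContext
{
    public SimpleOracleDbContext(string connStr) : base(connStr) { }
    public override DbConnection CreateConnection() => new OracleConnection(ConnStr);
    public override string BuildPagingSql(string sql, int pageIndex, int pageSize)
        => $"SELECT * FROM (SELECT t.*, ROWNUM rn FROM ({sql}) t WHERE ROWNUM <= {pageIndex * pageSize}) WHERE rn > {(pageIndex - 1) * pageSize}";
}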

9. Caching to improve performance

I remember that during a campus recruitment interview, the interviewer asked me: "Which do you think is faster, an ORM or ADO.NET?"

I foolishly blurted out: "ADO.NET, of course. An ORM has the extra step of converting LINQ statements to SQL, and compared with using ADO.NET directly it also has the extra step of mapping query results to objects."

The interviewer looked at me for three seconds and then said:

"OK, that's it today. Go back and wait for the announcement!"

What that kind of "notification" leads to is, of course, obvious...

Only after I learned more about how ORM frameworks work did I realize that the question was not as simple as my answer, because I had ignored the ORM cache.

CoffeeSQL also has ORM caches, divided into a table cache (second-level cache) and a SQL statement cache (first-level cache). The table cache is generally used for tables with small amounts of data and frequent queries; it caches the entire table's data into the cache medium. The SQL statement cache can be used with any kind of table and uses the SQL query statement as the cache key.
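The SQL statement cache, in particular, is conceptually just a dictionary keyed by the query text plus its parameters. Here is a bare-bones sketch of that idea with a fixed expiry; the key composition and expiry policy are my own assumptions, not CoffeeSQL's implementation.

using System;
using System.Collections.Concurrent;

//A minimal query cache: the key is the SQL text plus its parameters, the value expires after a fixed lifetime.
public class SqlQueryCache
{
    private readonly ConcurrentDictionary<string, (object Result, DateTime ExpiresAt)> _cache
        = new ConcurrentDictionary<string, (object Result, DateTime ExpiresAt)>();
    private readonly TimeSpan _lifetime;

    public SqlQueryCache(TimeSpan lifetime) => _lifetime = lifetime;

    public static string BuildKey(string sql, object[] parameters)
        => sql + "|" + string.Join(",", parameters ?? Array.Empty<object>());

    public object GetOrAdd(string key, Func<object> query)
    {
        if (_cache.TryGetValue(key, out var entry) && entry.ExpiresAt > DateTime.UtcNow)
            return entry.Result;                       //cache hit: skip the database round trip

        object result = query();                       //cache miss: run the real query
        _cache[key] = (result, DateTime.UtcNow.Add(_lifetime));
        return result;
    }
}

The table cache is the same idea at a coarser granularity: the key is the table itself, the value is the whole table's rows, and the entry is refreshed whenever the table is written.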

Of course, if you don't want to dig into the details, the configuration is as simple as this:

(ORM Cache Switch)

public class WQSDbContext : OracleDbContext<WQSDbContext>
{
    public WQSDbContext() : base(oracleConnStr)
    {
        this.OpenTableCache = true;
        this.OpenQueryCache = true;
    }
}

(Entity configuration for table cache)

[TableCaching]
[Table("B_DICTIONARY")]
public class Dictionary : EntityCommon
{
    [Column]
    public string Name { get; set; }

    [Column]
    public string Value { get; set; }

    [Column]
    public string Type { get; set; }
}

If an entity class is marked with TableCachingAttribute and table caching is turned on, the full data of the corresponding table will be cached.

Keep in mind that table caching should generally only be used on tables with small amounts of data and frequent queries. Enabling table caching scans the entire table in advance, stores its data in the local cache, and then serves queries for that table directly from the cache. If you enable it on a table that has a large amount of data and is not queried very often, you can imagine that the cost will outweigh the benefit.

The code shown above is what I originally envisioned for CoffeeSQL, and it has now become reality. In fact, the content above is effectively a CoffeeSQL operation manual, because all of the code is shown exactly as CoffeeSQL actually works.

More detailed usage of CoffeeSQL can be found in the test code in the source repository; readers can head over to the source code here:

https://gitee.com/xiaosen123/CoffeeSqlORM

This article is original content by the author; when reprinting, please cite the source: https://www.cnblogs.com/MaMaNongNong/p/12896757.html

3 June 2020, 22:04
