I'm used to modeling data with MySQL Workbench's EER diagrams and then generating SQL statements to execute against the database; that way, the relationships between tables stay intuitive.
Like this:

[figure: the EER model diagram]
Forward engineer the model to generate the DDL statements:

Ignore foreign key and foreign key index generation:

The generated DDL statements:
Finally, execute the statements against the database. And that is where I stepped into a pit.
Recently my team was reorganized and I was moved to another small group. Two days ago I received a new requirement, so as usual I modeled it with MySQL Workbench EER, only to be told that this project uses PostgreSQL!
So I was faced with the following choices:

1. Find a new modeling tool that supports exporting PostgreSQL DDL and redo the model. As far as I know, there is no great data modeling software on macOS. PowerDesigner can't be used (unless you install a virtual machine, or Wine); Navicat is too hard to use (some people actually call Navicat the best database client; all I can say to that is a big "wow" -- in my opinion this product doesn't even match IDEA's built-in database tooling; that view may be a little extreme, but as things stand, when I want to run a query, Navicat hides the query button very deep); IDEA has announced that it will develop similar features, but nothing has shipped; the open-source PDMan offers a good experience, but the model would also need to be rebuilt.
2. Keep exporting DDL with MySQL Workbench, and convert the MySQL DDL to PostgreSQL DDL myself.
I chose to convert the SQL statements myself.
Open source DDL conversion tool
Since the task was converting SQL statements, I figured the industry must already have related tools. So I searched the all-powerful GitHub, and sure enough there were some:

- mysql-to-postgres
- mysql-postgresql-converter
- various tools chained together (I have to admire that author's patience!)

However, after trying them, my heart sank: the generated DDL was either incorrect or missing the comments.
Develop your own tools
Considering that my requirement was actually very simple, just a DDL statement conversion, it wouldn't be hard to build one myself. And from studying the source of Mybatis' general Mapper earlier, I knew the Java world has a tool called jsqlparser.

After spending 10 minutes getting acquainted with jsqlparser, I started writing the tool. The first version took 20 minutes, then another 20 minutes verifying it with colleagues on the project, which settled on the version below. Here is the code:
Add the dependency:

<dependency>
    <groupId>com.github.jsqlparser</groupId>
    <artifactId>jsqlparser</artifactId>
    <version>1.2</version>
</dependency>
Write the code:
import net.sf.jsqlparser.JSQLParserException;
import net.sf.jsqlparser.parser.CCJSqlParserUtil;
import net.sf.jsqlparser.schema.Table;
import net.sf.jsqlparser.statement.Statements;
import net.sf.jsqlparser.statement.create.table.ColumnDefinition;
import net.sf.jsqlparser.statement.create.table.CreateTable;
import org.apache.commons.io.FileUtils;

import java.io.File;
import java.io.IOException;
import java.util.ArrayList;
import java.util.List;
import java.util.stream.Collectors;

public class MysqlDdl2PgDdlUtil {

    public static void main(String[] args) throws IOException, JSQLParserException {
        // Path to your MySQL DDL file
        String mysqlDDLPath = "/Users/reno/Downloads/mysql.sql";
        String dDLs = FileUtils.readFileToString(new File(mysqlDDLPath));
        System.out.println(dDLs);
        System.out.println("+++++++ start converting SQL statements +++++++");
        Statements statements = CCJSqlParserUtil.parseStatements(dDLs);
        statements.getStatements()
                .stream()
                .map(statement -> (CreateTable) statement)
                .forEach(ct -> {
                    Table table = ct.getTable();
                    List<ColumnDefinition> columnDefinitions = ct.getColumnDefinitions();
                    List<String> comments = new ArrayList<>();
                    // Pull the COMMENT off each column and turn it into a COMMENT ON COLUMN statement
                    List<ColumnDefinition> collect = columnDefinitions.stream()
                            .peek(columnDefinition -> {
                                List<String> columnSpecStrings = columnDefinition.getColumnSpecStrings();
                                int commentIndex = getCommentIndex(columnSpecStrings);
                                if (commentIndex != -1) {
                                    int commentStringIndex = commentIndex + 1;
                                    String commentString = columnSpecStrings.get(commentStringIndex);
                                    String commentSql = genCommentSql(table.toString(), columnDefinition.getColumnName(), commentString);
                                    comments.add(commentSql);
                                    columnSpecStrings.remove(commentStringIndex);
                                    columnSpecStrings.remove(commentIndex);
                                }
                                columnDefinition.setColumnSpecStrings(columnSpecStrings);
                            })
                            .collect(Collectors.toList());
                    ct.setColumnDefinitions(collect);
                    // Map MySQL-specific syntax onto the PostgreSQL equivalent
                    String createSQL = ct.toString()
                            .replaceAll("`", "\"")
                            .replaceAll("BIGINT UNIQUE NOT NULL AUTO_INCREMENT", "BIGSERIAL PRIMARY KEY")
                            .replaceAll("BIGINT NULL AUTO_INCREMENT", "BIGSERIAL PRIMARY KEY")
                            .replaceAll("BIGINT NOT NULL AUTO_INCREMENT", "BIGSERIAL PRIMARY KEY")
                            .replaceAll("INT NOT NULL AUTO_INCREMENT", "BIGSERIAL PRIMARY KEY")
                            .replaceAll("INT NULL AUTO_INCREMENT", "BIGSERIAL PRIMARY KEY")
                            .replaceAll("IF NOT EXISTS", "")
                            .replaceAll("TINYINT", "SMALLINT")
                            .replaceAll("DATETIME", "TIMESTAMP")
                            .replaceAll(", PRIMARY KEY \\(\"id\"\\)", "");
                    // Drop the table-level comment, if any
                    if (createSQL.contains("COMMENT")) {
                        createSQL = createSQL.substring(0, createSQL.indexOf("COMMENT"));
                    }
                    System.out.println(createSQL + ";");
                    comments.forEach(t -> System.out.println(t.replaceAll("`", "\"") + ";"));
                });
    }

    /**
     * Find the index of the COMMENT keyword in the column spec strings.
     *
     * @param columnSpecStrings columnSpecStrings
     * @return the index, or -1 if the column has no comment
     */
    private static int getCommentIndex(List<String> columnSpecStrings) {
        for (int i = 0; i < columnSpecStrings.size(); i++) {
            if ("COMMENT".equalsIgnoreCase(columnSpecStrings.get(i))) {
                return i;
            }
        }
        return -1;
    }

    /**
     * Build a COMMENT ON COLUMN statement.
     *
     * @param table        table name
     * @param column       column name
     * @param commentValue comment text
     * @return the COMMENT statement
     */
    private static String genCommentSql(String table, String column, String commentValue) {
        return String.format("COMMENT ON COLUMN %s.%s IS %s", table, column, commentValue);
    }
}
As the code shows, the tool currently generates PostgreSQL DDL by combining jsqlparser's SQL parsing capabilities with string substitution.
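The string-substitution stage can be illustrated on its own. The sketch below (class and method names are my own, not from the original tool) applies the same MySQL-to-PostgreSQL token mapping without involving jsqlparser:

```java
// A minimal sketch of the token-mapping stage only; illustrative names,
// not the original tool. Assumes input fragments in MySQL Workbench style.
public class MysqlToPgTokenMapper {

    /** Apply the ordered replacements used after parsing. */
    public static String map(String mysqlDdl) {
        return mysqlDdl
                .replace("`", "\"")  // identifier quoting: backticks -> double quotes
                // collapse the AUTO_INCREMENT variants into BIGSERIAL PRIMARY KEY
                .replaceAll("BIGINT (UNIQUE )?(NOT )?NULL AUTO_INCREMENT", "BIGSERIAL PRIMARY KEY")
                .replaceAll("INT (NOT )?NULL AUTO_INCREMENT", "BIGSERIAL PRIMARY KEY")
                .replace("IF NOT EXISTS ", "")
                .replace("TINYINT", "SMALLINT")   // PostgreSQL has no TINYINT
                .replace("DATETIME", "TIMESTAMP");
    }

    public static void main(String[] args) {
        System.out.println(map("`id` INT NOT NULL AUTO_INCREMENT"));
        System.out.println(map("`flag` TINYINT NULL"));
    }
}
```

Note that plain substitution like this is fragile (it would also rewrite matching text inside string literals), which is exactly why the full tool parses the DDL first.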
Effect demonstration
DDL before conversion:
-- Table `user`
CREATE TABLE IF NOT EXISTS `user` (
  `id` INT NOT NULL AUTO_INCREMENT COMMENT 'id',
  `username` VARCHAR(16) NOT NULL COMMENT 'username',
  `email` VARCHAR(255) NULL COMMENT 'mail',
  `password` VARCHAR(32) NOT NULL COMMENT 'password',
  `create_time` TIMESTAMP NULL DEFAULT CURRENT_TIMESTAMP COMMENT 'creation time',
  PRIMARY KEY (`id`));

-- Table `movie`
CREATE TABLE IF NOT EXISTS `movie` (
  `id` INT NOT NULL AUTO_INCREMENT COMMENT 'Id',
  `name` VARCHAR(255) NOT NULL COMMENT 'name',
  `user_id` INT NOT NULL COMMENT 'user.id',
  PRIMARY KEY (`id`))
COMMENT = 'movie table';
Converted DDL:
CREATE TABLE "user" ("id" BIGSERIAL PRIMARY KEY, "username" VARCHAR(16) NOT NULL, "email" VARCHAR(255) NULL, "password" VARCHAR(32) NOT NULL, "create_time" TIMESTAMP NULL DEFAULT CURRENT_TIMESTAMP);
COMMENT ON COLUMN "user"."id" IS 'id';
COMMENT ON COLUMN "user"."username" IS 'username';
COMMENT ON COLUMN "user"."email" IS 'mail';
COMMENT ON COLUMN "user"."password" IS 'password';
COMMENT ON COLUMN "user"."create_time" IS 'creation time';
CREATE TABLE "movie" ("id" BIGSERIAL PRIMARY KEY, "name" VARCHAR(255) NOT NULL, "user_id" INT NOT NULL);
COMMENT ON COLUMN "movie"."id" IS 'Id';
COMMENT ON COLUMN "movie"."name" IS 'name';
COMMENT ON COLUMN "movie"."user_id" IS 'user.id';
The result is decent and basically meets my requirements.
Shortcomings
At present the tool's code is rather crude. To improve it properly, the tool should understand the lexical structure of MySQL DDL and build it into objects such as Table, Column, Comment, Constraint, Index, and so on:
class Table {
    private String name;
    private Column column;
}

class Column {
    private String name;
    private String type;
    // constraints, such as NOT NULL
    private Set<Constraint> constraints;
    // index
    private Index index;
}

class Index {
    private String name;
    private String type;
}

enum Constraint {
    NOT_NULL, ...;
}
Then abstract a dialect enumeration, implement a DDL generator handler for each dialect, and generate DDL statements for the different database platforms accordingly.
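As a rough illustration of that idea, the sketch below (all names hypothetical) pairs each dialect with its own small generator function; a real implementation would render full Table/Column objects rather than a single auto-increment id column:

```java
import java.util.function.Function;

// Hypothetical sketch of the proposed dialect abstraction: one small DDL
// generator per target dialect, selected via an enum.
public class DialectSketch {

    enum Dialect {
        // Each constant carries its own rendering rule for an auto-increment id column.
        MYSQL(name -> "`" + name + "` BIGINT NOT NULL AUTO_INCREMENT"),
        POSTGRESQL(name -> "\"" + name + "\" BIGSERIAL");

        private final Function<String, String> idColumnDdl;

        Dialect(Function<String, String> idColumnDdl) {
            this.idColumnDdl = idColumnDdl;
        }

        String renderIdColumn(String name) {
            return idColumnDdl.apply(name);
        }
    }

    public static void main(String[] args) {
        // The same logical column, rendered per dialect
        System.out.println(Dialect.MYSQL.renderIdColumn("id"));
        System.out.println(Dialect.POSTGRESQL.renderIdColumn("id"));
    }
}
```

The point of the design is that adding a new target platform then means adding one enum constant, not touching the conversion pipeline.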
Why haven't I improved it? No time. The tool exists to serve the work, and since it already achieves my goal, there is no motivation to refactor. If the need arises, I'll improve it later.
Original article: http://www.itmuch.com/work/mysql-ddl-2-pgsql-ddl/. Please credit the source when reposting.