
The Insert Operation

The select operation was quite a journey, and there are a lot of classes from it that I will reuse in this article. If you haven't looked at it yet, it would be the ideal place to start.

The insert operation will typically return either:

  1. Single record
  2. Error (for example, when no record was created)

In the interest of keeping the result compact, a good strategy would be to merge (1) and (2) into a single class:

interface InsertResult<T> {
    T result();
    String error();
}
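The rest of the article leans on two convenience factory methods on this result type - success() and failure() - and reads result and error directly. A minimal sketch of what a concrete class could look like under those assumptions (the plain public fields and the two failure overloads are inferred from how the type is used later, not copied from the repository):

class InsertResult<T> {
    final T result;      //the persisted entity (null when the insert failed)
    final String error;  //the error message (null when the insert succeeded)

    private InsertResult(T result, String error) {
        this.result = result;
        this.error = error;
    }

    static <T> InsertResult<T> success(T entity) {
        return new InsertResult<>(entity, null);
    }

    static <T> InsertResult<T> failure(String error) {
        return new InsertResult<>(null, error);
    }

    static <T> InsertResult<T> failure(Exception e) {
        return new InsertResult<>(null, e.getMessage());
    }
}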

The result in the InsertResult object would be pretty much the same entity being persisted, with the exception that the result would also include the primary key value if one was generated - either by the database (through auto-generated keys) or by the application (using a natural unique identifier, like a state ID number, for example).

The insert function pseudocode from a high level can then be reduced to something like this:

insert operation

function InsertResult<T> insertOperation(String query, T entity) {
    try (Connection conn = borrowConnection()) {
        try (PreparedStatement ps = conn.prepareStatement(query, Statement.RETURN_GENERATED_KEYS)) {
            EntityMetadata metadata = entity.metadata()
            Object[] parameters = metadata.resolveInsertParameters(query, entity)

            for (int i = 0; i < parameters.length; i++) {
                ps.setObject(i + 1, parameters[i])
            }

            int insertedRows = ps.executeUpdate()
            if (insertedRows > 0) {
                try (ResultSet rs = ps.getGeneratedKeys()) {
                    if (rs.next()) {
                        String generatedColumnName = metadata.resolveGeneratedColumnName()
                        Class<?> generatedColumnType = metadata.resolveGeneratedColumnType(generatedColumnName)
                        entity.set(generatedColumnName, rs.getObject(generatedColumnName, generatedColumnType))
                    }
                    return InsertResult.success(entity)
                } catch (Exception e) {
                    return InsertResult.failure(e)
                }
            } else {
                return InsertResult.failure("No new record was created")
            }
        } catch (Exception e) {
            return InsertResult.failure(e)
        }
    } catch (Exception e) {
        return InsertResult.failure(e)
    }
}

Entity with no relationships#

Visit Github Resources

This pseudocode looks simple enough, but it is not quite that simple. Why so? The short answer is that, as is the case with select, relationships just make things complicated. For the longer answer, stay with me. For a more concrete discussion, I will use a Task entity to illustrate the concepts.

DROP TABLE PUBLIC.TBL_TASK;
CREATE TABLE PUBLIC.TBL_TASK (
    ID UUID(16) DEFAULT RANDOM_UUID() NOT NULL,
    NAME VARCHAR(50) NOT NULL,
    DONE BOOLEAN DEFAULT FALSE NOT NULL,
    TASK_CREATED TIMESTAMP DEFAULT NOW() NOT NULL,
    CONSTRAINT TASK_PK PRIMARY KEY (ID)
);
CREATE UNIQUE INDEX PRIMARY_KEY_4 ON PUBLIC.TBL_TASK (ID);
CREATE UNIQUE INDEX UNIQ_NAME_INDEX_4 ON PUBLIC.TBL_TASK (NAME);

The corresponding entity class would look like this

class Task {
    UUID id()
    String name()
    Boolean completed()
    LocalDate dateCreated()
}

The insert handler has two dynamic parts, while the rest of the code is much like the guard rails around it. The complexity of the insert operation (for this iteration) can be reduced to considering these dynamic parts:

//the query parameters
EntityMetadata metadata = entity.metadata()
Object[] parameters = metadata.resolveInsertParameters(query, entity)

for (int i = 0; i < parameters.length; i++) {
    ps.setObject(i + 1, parameters[i])
}

//and

//the result set rows
int insertedRows = ps.executeUpdate();
if (insertedRows > 0) {
    try (ResultSet rs = ps.getGeneratedKeys()) {
        if (rs.next()) {
            String generatedColumnName = metadata.resolveGeneratedColumnName();
            Class<?> generatedColumnType = metadata.resolveGeneratedColumnType(generatedColumnName);
            entity.set(generatedColumnName, rs.getObject(generatedColumnName, generatedColumnType));
        }
        return InsertResult.success(entity);
    } catch (Exception e) {
        return InsertResult.failure(e);
    }
}

The ColumnInfo we have been using needs a new attribute to indicate that a column's value is auto-generated.

Boolean isAutoGenerated;
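For reference, here is a minimal sketch of the shape ColumnInfo takes at this point; the constructor argument order (column name, attribute name, attribute type, auto-generated flag) is inferred from how it is used in the registry below:

class ColumnInfo {
    String columnName;       //name of the column in the table
    String attributeName;    //name of the attribute on the entity class
    Class<?> attributeType;  //Java type of the attribute
    Boolean isAutoGenerated; //true when the database generates the value

    ColumnInfo(String columnName, String attributeName, Class<?> attributeType, Boolean isAutoGenerated) {
        this.columnName = columnName;
        this.attributeName = attributeName;
        this.attributeType = attributeType;
        this.isAutoGenerated = isAutoGenerated;
    }
}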

The Entity class will now also need an additional get() function to read a value from the entity instance.

interface Entity {
    EntityMetadata metadata();
    <T> void set(String attribute, T value);
    Object get(String attribute);
}

The Task implementation of the Entity interface would look like this:

class Task extends HashMap<String, Object> implements Entity {

    public Task() {
    }

    public Task(String name, Boolean completed) {
        set("name", name);
        set("completed", completed);
        set("dateCreated", LocalDate.now());
    }

    @Override
    public EntityMetadata metadata() {
        return EntityRegistry.registry.get(Task.class);
    }

    @Override
    public <T> void set(String attribute, T value) {
        super.put(attribute, value);
    }

    @Override
    public Object get(String attribute) {
        return super.get(attribute);
    }
}

The EntityRegistry with the Task's metadata looks like this:

class EntityRegistry {

    public static final Map<Class<? extends Entity>, EntityMetadata> registry = new HashMap<>();

    static {
        //Task Entity
        registry.put(Task.class, new EntityMetadata("tbl_task", Set.of(
                new ColumnInfo("id", "id", UUID.class, true),
                new ColumnInfo("name", "name", String.class, false),
                new ColumnInfo("done", "completed", Boolean.class, false),
                new ColumnInfo("task_created", "dateCreated", LocalDate.class, false)
        )) {
            @Override
            public Task entityInstance() {
                return new Task();
            }
        });
    }
}

And the EntityMetadata again provides the perfect place to add the resolving functions needed in the insert() function.

public Optional<ColumnInfo> resolveByColumnName(String columnName) {
    return columns.stream()
            .filter(info -> info.columnName != null)
            .filter(info -> info.columnName.equals(columnName))
            .findFirst();
}

public Object[] resolveQueryParameters(String[] columnNames, Entity entity) {
    Object[] parameters = new Object[columnNames.length];
    for (int i = 0; i < parameters.length; i++) {
        String columnName = columnNames[i];
        String attributeName = resolveByColumnName(columnName).orElseThrow(() ->
                new EntityMappingException(String.format("There is no mapping for column %s in the metadata", columnName))).attributeName;
        parameters[i] = entity.get(attributeName);
    }
    return parameters;
}

public String resolveGeneratedColumnName() {
    Optional<ColumnInfo> optional = columns.stream().filter(info -> info.isAutoGenerated).findFirst();
    if (optional.isPresent()) {
        return optional.get().columnName;
    }
    throw new EntityMappingException("There is no auto-generated column defined in the metadata");
}

public Class<?> resolveGeneratedColumnType(String columnName) {
    return resolveByColumnName(columnName).orElseThrow(() ->
            new EntityMappingException(String.format("There is no mapping for column %s in the metadata", columnName))).attributeType;
}
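These functions live on the EntityMetadata class carried over from the select article. As a reminder of where they sit, a rough skeleton is shown below; the exact constructor and field visibility are assumptions based on how the class is used in this article:

abstract class EntityMetadata {
    final String tableName;        //table the entity maps to (null for non-table entities)
    final Set<ColumnInfo> columns; //column-to-attribute mappings

    EntityMetadata(String tableName, Set<ColumnInfo> columns) {
        this.tableName = tableName;
        this.columns = columns;
    }

    //each registry entry supplies a factory for a blank entity instance
    public abstract <T extends Entity> T entityInstance();

    //...the resolve* functions above are added here
}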

For the most basic use-case, the finished Insert class for this scenario will look like this:

class Insert {

    private static final String connectionString = "jdbc:h2:./data/sample-i_1.db";

    public static void main(String[] args) {
        Task taskEntity = new Task("Crazy jacks", false);
        InsertResult<Task> task = insert("insert into tbl_task (name) values (?)", new String[]{"name"}, taskEntity);
        if (task.error != null) {
            System.out.println(task.error);
        } else {
            System.out.println(task.result);
        }
    }

    public static <T extends Entity> InsertResult<T> insert(String query, String[] columnNames, T entity) {
        try (Connection conn = DriverManager.getConnection(connectionString)) {
            try (PreparedStatement ps = conn.prepareStatement(query, Statement.RETURN_GENERATED_KEYS)) {
                EntityMetadata metadata = entity.metadata();
                Object[] parameters = metadata.resolveQueryParameters(columnNames, entity);

                for (int i = 0; i < parameters.length; i++) {
                    ps.setObject(i + 1, parameters[i]);
                }

                int insertedRows = ps.executeUpdate();
                if (insertedRows > 0) {
                    try (ResultSet rs = ps.getGeneratedKeys()) {
                        if (rs.next()) {
                            String generatedColumnName = metadata.resolveGeneratedColumnName();
                            Class<?> generatedColumnType = metadata.resolveGeneratedColumnType(generatedColumnName);
                            entity.set(generatedColumnName, rs.getObject(generatedColumnName, generatedColumnType));
                        }
                        return InsertResult.success(entity);
                    } catch (Exception e) {
                        return InsertResult.failure(e);
                    }
                } else {
                    return InsertResult.failure(new InsertOpException("No new record was created"));
                }
            } catch (Exception e) {
                return InsertResult.failure(e);
            }
        } catch (Exception e) {
            return InsertResult.failure(e);
        }
    }
}

Entity with an embedded association#

Visit Github Resources

So we've managed to get away with the bare minimum requirements to persist an entity which had no relationships. We churned out a lot of boilerplate code which will be reused and improved upon as we evaluate other scenarios. In this section, we'll look at a scenario where the entity has grouped together some table columns under a single embedded entity.

DROP TABLE PUBLIC.TBL_CUSTOMER;
CREATE TABLE PUBLIC.TBL_CUSTOMER (
    MEMBER_ID UUID(16) DEFAULT RANDOM_UUID() NOT NULL,
    FIRST_NAME VARCHAR(50) NOT NULL,
    LAST_NAME VARCHAR(50) NOT NULL,
    DATE_JOINED TIMESTAMP DEFAULT NOW() NOT NULL,
    MEMBER_LEVEL VARCHAR DEFAULT 'SILVER',
    ADDR_CITY VARCHAR(50),
    ADDR_STATE VARCHAR(30),
    ADDR_POSTAL_CODE VARCHAR(10),
    CONSTRAINT CUSTOMER_PK PRIMARY KEY (MEMBER_ID)
);
CREATE UNIQUE INDEX PRIMARY_KEY_C ON PUBLIC.TBL_CUSTOMER (MEMBER_ID);

The corresponding entity classes would look like this:

Address class

class Address implements Entity {
    String city;
    String state;
    String zipCode;

Customer class

class Customer implements Entity {
    UUID id;
    String firstName;
    String lastName;
    Level level;
    Address address;
    LocalDate dateJoined;

In this scenario, we'll push the metadata definition to a dedicated class, the EntityRegistry. This bears a lot of similarities to the same class used while evaluating the select operation.

EntityRegistry

class EntityRegistry {

    public static final Map<Class<? extends Entity>, EntityMetadata> registry = new HashMap<>();

    static {
        //Address Entity
        registry.put(Address.class, new EntityMetadata(Address.class, null, Set.of(
                ColumnInfo.builder().columnName("addr_city").attributeName("city").build(),
                ColumnInfo.builder().columnName("addr_state").attributeName("state").build(),
                ColumnInfo.builder().columnName("addr_postal_code").attributeName("zipCode").build()
        ), EntityType.EMBEDDABLE) {

            @Override
            public Address entityInstance() {
                return new Address();
            }
        });

        //Customer Entity
        registry.put(Customer.class, new EntityMetadata(Customer.class, "tbl_customer", Set.of(
                ColumnInfo.builder().columnName("member_id").attributeName("id").isAutoGenerated(true).attributeType(UUID.class).build(),
                ColumnInfo.builder().columnName("first_name").attributeName("firstName").build(),
                ColumnInfo.builder().columnName("last_name").attributeName("lastName").build(),
                ColumnInfo.builder().columnName("member_level").attributeName("level").attributeType(Level.class).isEnum(true).build(),
                ColumnInfo.builder().attributeName("address").attributeType(Address.class).isEmbedded(true).build(),
                ColumnInfo.builder().columnName("date_joined").attributeName("dateJoined").attributeType(LocalDate.class).build()
        )) {

            @Override
            public Customer entityInstance() {
                return new Customer();
            }
        });
    }
}

One big change will be in the EntityMetadata class. We will remove some existing functions and use a few new improved ones.

//... same as before
public abstract <T extends Entity> T entityInstance();

public Optional<ColumnInfo> resolveByColumnName(String columnName) {
    return columns.stream()
            .filter(info -> info.columnName != null)
            .filter(info -> info.columnName.equals(columnName))
            .findFirst();
}

public ColumnInfo getGeneratedColumn() {
    Optional<ColumnInfo> optional = columns.stream().filter(info -> info.isAutoGenerated).findFirst();
    if (optional.isPresent()) {
        return optional.get();
    }
    throw new EntityMappingException("There is no auto-generated column defined in the metadata");
}

public boolean containsEmbedded() {
    return columns.stream().anyMatch(info -> info.isEmbedded);
}

public List<ColumnInfo> embeddedColumns() {
    return columns.stream().filter(info -> info.isEmbedded).collect(Collectors.toList());
}

public void resolveInsertableColumns(Entity entity, Map<String, Object> map) {
    EntityMetadata entityMetadata = entity.metadata();
    if (entityMetadata.containsEmbedded()) {
        for (ColumnInfo embeddedColumn : embeddedColumns()) {
            Entity embeddedValue = (Entity) entity.get(embeddedColumn.attributeName);
            resolveInsertableColumns(embeddedValue, map);
        }
    }
    entityMetadata.columns.stream()
            .filter(info -> !info.isAutoGenerated)
            .filter(info -> info.columnName != null)
            .forEach(info -> {
                map.put(info.columnName, entity.get(info.attributeName));
            });
}

The other big change will be in the Insert class. Here, the insert method will just take an Entity instance, and then infer the rest of the information it needs, making use of the improved functions in the EntityMetadata class.

Insert::insert

public static <T extends Entity> InsertResult<T> insert(T entity) {
    try (Connection conn = DriverManager.getConnection(connectionString)) {
        EntityMetadata metadata = entity.metadata();
        Map<String, Object> insertableColumns = new LinkedHashMap<>(); //be sure to use "linked" version of hashmap to retain order
        resolveInsertableColumns(entity, insertableColumns);
        String[] columnNames = insertableColumns.keySet().toArray(new String[0]);
        String insertQuery = metadata.resolveInsertQuery(columnNames);

        try (PreparedStatement ps = conn.prepareStatement(insertQuery, Statement.RETURN_GENERATED_KEYS)) {
            Object[] parameters = insertableColumns.values().toArray(new Object[0]);

            for (int i = 0; i < parameters.length; i++) {
                ps.setObject(i + 1, parameters[i]);
            }

            int insertedRows = ps.executeUpdate();
            if (insertedRows > 0) {
                try (ResultSet rs = ps.getGeneratedKeys()) {
                    if (rs.next()) {
                        ColumnInfo generatedColumn = metadata.getGeneratedColumn();
                        Class<?> generatedColumnType = generatedColumn.attributeType;
                        entity.set(generatedColumn.attributeName, rs.getObject(generatedColumn.columnName, generatedColumnType));
                    }
                    return InsertResult.success(entity);
                } catch (Exception e) {
                    return InsertResult.failure(e);
                }
            } else {
                return InsertResult.failure(new InsertOpException("No new record was created"));
            }
        } catch (Exception e) {
            return InsertResult.failure(e);
        }
    } catch (Exception e) {
        return InsertResult.failure(e);
    }
}

private static void resolveInsertableColumns(Entity entity, Map<String, Object> map) {
    EntityMetadata entityMetadata = entity.metadata();
    if (entityMetadata.containsEmbedded()) {
        for (ColumnInfo embeddedColumn : entityMetadata.embeddedColumns()) {
            Entity embeddedValue = (Entity) entity.get(embeddedColumn.attributeName);
            resolveInsertableColumns(embeddedValue, map);
        }
    }
    entityMetadata.columns.stream()
            .filter(info -> !info.isAutoGenerated)
            .filter(info -> info.columnName != null)
            .forEach(info -> {
                map.put(info.columnName, entity.get(info.attributeName));
            });
}
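The one helper used above that is not shown is resolveInsertQuery, which turns the resolved column names into the actual SQL statement. A minimal sketch of how it might be implemented on EntityMetadata (the actual version in the repository may differ):

public String resolveInsertQuery(String[] columnNames) {
    //e.g. "insert into tbl_customer (first_name, last_name, ...) values (?, ?, ...)"
    String columns = String.join(", ", columnNames);
    String placeholders = String.join(", ", java.util.Collections.nCopies(columnNames.length, "?"));
    return String.format("insert into %s (%s) values (%s)", tableName, columns, placeholders);
}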

With the changes in place, the last thing is to fire up the insert example

public static void main(String[] args) {
    Address address = new Address("chicago", "IL", "60606");
    Customer customerEntity = new Customer(null, "James", "Makala", Level.GOLD, address, LocalDate.now());
    InsertResult<Customer> task = insert(customerEntity);
    if (task.error != null) {
        System.out.println(task.error);
    } else {
        System.out.println(task.result);
    }
}

Table with a composite primary key#

Visit Github Resources

In this scenario, the table has a composite primary key:

DROP TABLE PUBLIC.TBL_USER;
CREATE TABLE PUBLIC.TBL_USER (
    EMAIL_ADDRESS VARCHAR(100) NOT NULL,
    USERNAME VARCHAR(50) NOT NULL,
    NICKNAME VARCHAR(50),
    DATE_JOINED TIMESTAMP DEFAULT NOW() NOT NULL,
    ACCESS_LEVEL VARCHAR DEFAULT 'VISITOR',
    ADDR_CITY VARCHAR(50),
    ADDR_STATE VARCHAR(30),
    ADDR_POSTAL_CODE VARCHAR(10),
    CONSTRAINT CUSTOMER_PK PRIMARY KEY (EMAIL_ADDRESS,USERNAME)
);
CREATE UNIQUE INDEX PRIMARY_KEY_4 ON PUBLIC.TBL_USER (EMAIL_ADDRESS,USERNAME);

The corresponding entity representation will look like this

UserId

class UserId implements Entity {

    String emailAddress;
    String userName;

    //...same as before

    @Override
    public Object get(String attribute) {
        switch (attribute) {
            case "emailAddress":
                return this.emailAddress;
            case "userName":
                return this.userName;
            default:
                LOG.warning("what the heck are you doing?");
                throw new EntityMappingException(String.format("There is no attribute in the entity with the name %s", attribute));
        }
    }
}

User

class User implements Entity {

    UserId userId;
    String nickName;
    AccessLevel accessLevel;
    Address address;
    LocalDate dateJoined;

    //...same as before

    @Override
    public Object get(String attribute) {
        switch (attribute) {
            case "userId":
                return this.userId;
            case "nickName":
                return this.nickName;
            case "accessLevel":
                return this.accessLevel;
            case "address":
                return this.address;
            case "dateJoined":
                return this.dateJoined;
            default:
                LOG.warning("what the heck are you doing?");
                throw new EntityMappingException(String.format("There is no attribute in the entity with the name %s", attribute));
        }
    }
}

The UserId is of Entity type, so this needs to be captured in the EntityType enum

EntityType enum

public enum EntityType {
    IDENTIFIABLE,
    EMBEDDABLE,
    COMPOSITE_PK
}

The UserId and User metadata will look like this

UserId metadata

registry.put(UserId.class, new EntityMetadata(UserId.class, null, Set.of(
    ColumnInfo.builder().columnName("email_address").attributeName("emailAddress").build(),
    ColumnInfo.builder().columnName("username").attributeName("userName").build()
), EntityType.COMPOSITE_PK) {

    @Override
    public UserId entityInstance() {
        return new UserId();
    }
});

User metadata

registry.put(User.class, new EntityMetadata(User.class, "tbl_user", Set.of(
    ColumnInfo.builder().attributeName("userId").attributeType(UserId.class).isCompositePk(true).build(),
    ColumnInfo.builder().columnName("nickname").attributeName("nickName").build(),
    ColumnInfo.builder().columnName("access_level").attributeName("accessLevel").attributeType(AccessLevel.class).isEnum(true).isEnumOrdinal(true).build(),
    ColumnInfo.builder().attributeName("address").attributeType(Address.class).isEmbedded(true).build(),
    ColumnInfo.builder().columnName("date_joined").attributeName("dateJoined").attributeType(LocalDate.class).build()
)) {

    @Override
    public User entityInstance() {
        return new User();
    }
});

At this juncture, we need to update the ColumnInfo to reflect this new scenario. I will also throw in an additional attribute to allow storing an enum value as its ordinal position in the enumeration. The ColumnInfoBuilder should be updated to include these two attributes as well.

Boolean isEnumOrdinal;
Boolean isCompositePk;
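As a rough idea of the corresponding ColumnInfoBuilder update, the builder simply mirrors the new attributes in its fluent style; this is a sketch only, and the actual builder in the repository carries more fields than shown here:

class ColumnInfoBuilder {
    //...existing builder fields

    private Boolean isEnumOrdinal = false;
    private Boolean isCompositePk = false;

    //store the enum value as its ordinal position instead of its name
    public ColumnInfoBuilder isEnumOrdinal(Boolean isEnumOrdinal) {
        this.isEnumOrdinal = isEnumOrdinal;
        return this;
    }

    //mark the attribute as holding a composite primary key entity
    public ColumnInfoBuilder isCompositePk(Boolean isCompositePk) {
        this.isCompositePk = isCompositePk;
        return this;
    }

    //...build() passes these along to the ColumnInfo being created
}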

One final change will be in the Insert class, where we have to accommodate this new scenario.

private static void resolveInsertableColumns(Entity entity, Map<String, Object> map) {
    EntityMetadata entityMetadata = entity.metadata();
    if (entityMetadata.containsEmbedded()) {
        for (ColumnInfo embeddedColumn : entityMetadata.embeddedColumns()) {
            Entity embeddedValue = (Entity) entity.get(embeddedColumn.attributeName);
            resolveInsertableColumns(embeddedValue, map);
        }
    }
    if (entityMetadata.containsCompositePk()) {
        ColumnInfo compositeKeyColumn = entityMetadata.compositePkColumn();
        Entity compositeKey = (Entity) entity.get(compositeKeyColumn.attributeName);
        resolveInsertableColumns(compositeKey, map);
    }
    entityMetadata.columns.stream()
        .filter(info -> !info.isAutoGenerated)
        .filter(info -> info.columnName != null)
        .forEach(info -> {
            if (info.isEnum) {
                if (info.isEnumOrdinal) {
                    map.put(info.columnName, ((Enum<?>) entity.get(info.attributeName)).ordinal());
                } else {
                    map.put(info.columnName, entity.get(info.attributeName).toString());
                }
            } else {
                map.put(info.columnName, entity.get(info.attributeName));
            }
        });
}

With those changes in place, the last thing is to fire up the insert operation

public static void main(String[] args) {
    UserId userId = new UserId("some5@email.com", "poozi");
    Address address = new Address("chicago", "IL", "60606");
    User userEntity = new User(userId, "nicki", AccessLevel.USER, address, LocalDate.now());
    InsertResult<User> user = insert(userEntity);
    if (user.error != null) {
        System.out.println(user.error);
    } else {
        System.out.println(user.result);
    }
}

Table with 1-to-1 relationship#

Visit Github Resources

We have considered inserting records with no relationships so far, and it has been smooth sailing. Let's go another step further and consider a scenario where the entity has a 1-to-1 relationship with another entity.

One thing that is different about this scenario is that at the end of the operation, there might be more than one entity persisted. If the foreign key entity has no identity, then it needs to be persisted first, so that its primary key may be used in the respective foreign key column. This is a cascade effect of the insert operation, which we shall consider to always happen whenever this scenario arises.

Now, we'd like to also introduce two more things:

  1. Use the same connection to persist both entities
  2. Wrap the persistence in a transaction so that it is performed atomically.

Let's begin with considering the database schema

DROP TABLE PUBLIC.TBL_TASK;
CREATE TABLE PUBLIC.TBL_TASK (
    ID UUID(16) DEFAULT RANDOM_UUID() NOT NULL,
    NAME VARCHAR(50) NOT NULL,
    DONE BOOLEAN DEFAULT FALSE NOT NULL,
    TASK_CREATED TIMESTAMP DEFAULT NOW() NOT NULL,
    PARENT_TASK UUID(16),
    CONSTRAINT TASK_PK PRIMARY KEY (ID),
    CONSTRAINT TASK_PARENT_FK FOREIGN KEY (PARENT_TASK) REFERENCES PUBLIC.TBL_TASK(ID) ON DELETE RESTRICT ON UPDATE RESTRICT
);
CREATE UNIQUE INDEX PRIMARY_KEY_4 ON PUBLIC.TBL_TASK (ID);
CREATE INDEX TASK_PARENT_FK_INDEX_4 ON PUBLIC.TBL_TASK (PARENT_TASK);
CREATE UNIQUE INDEX UNIQ_NAME_INDEX_4 ON PUBLIC.TBL_TASK (NAME);

The corresponding Entity class representation would look like this:

class Task implements Entity {

    UUID id;
    String name;
    Boolean completed;
    LocalDate dateCreated;
    Task parentTask;

The ColumnInfo class will need additional attributes to model this scenario, and the ColumnInfoBuilder class will need to be updated as necessary.

ColumnInfo.class

Boolean isPkColumn;
Boolean isFkColumn;

The Task EntityMetadata will look as shown below

//Task Entity
registry.put(Task.class, new EntityMetadata(Task.class, "tbl_task", Set.of(
    ColumnInfo.builder().columnName("id").attributeName("id").attributeType(UUID.class).isPkColumn(true).isAutoGenerated(true).build(),
    ColumnInfo.builder().columnName("name").attributeName("name").build(),
    ColumnInfo.builder().columnName("done").attributeName("completed").attributeType(Boolean.class).build(),
    ColumnInfo.builder().columnName("task_created").attributeName("dateCreated").attributeType(LocalDate.class).build(),
    ColumnInfo.builder().columnName("parent_task").attributeName("parentTask").attributeType(Task.class).isFKColumn(true).build()
)) {

    @Override
    public Task entityInstance() {
        return new Task();
    }
});

To support atomicity of the insert operation, we'll want to split the insert function into different functions to accommodate cascading the insert. The first part will borrow a connection, start a transaction on it, and hand it over to a helper insert function which will persist foreign key associations before persisting the top-level entity. This first part will also commit the transaction if it's successful, or roll it back if there was a problem.

public static <T extends Entity> InsertResult<T> insert(T entity) {
    try {
        Connection conn = DriverManager.getConnection(connectionString);
        conn.setAutoCommit(false);
        try {
            InsertResult<T> result = insert(entity, conn);
            conn.commit();
            return result;
        } catch (Exception e) {
            conn.rollback();
            return InsertResult.failure(e);
        } finally {
            if (conn != null) {
                conn.setAutoCommit(true);
                conn.close();
            }
        }
    } catch (SQLException e) {
        throw new InsertOpException("Could not get connection to insert data record");
    }
}

The second part will continue performing the insert operations the same way they have hitherto been done with a few enhancements - it will persist foreign key associations before anything else.

public static <T extends Entity> InsertResult<T> insert(T entity, Connection conn) {
    EntityMetadata metadata = entity.metadata();
    Map<String, Object> insertableColumns = new LinkedHashMap<>(); //be sure to use "linked" version of hashmap to retain order

    //handle foreign key associations first
    if (metadata.containsFkColumns()) {
        for (ColumnInfo fkColumnInfo : metadata.fkColumns()) {
            if (fkColumnInfo.compositeColumns.length == 0) {
                Entity fkEntity = (Entity) entity.get(fkColumnInfo.attributeName);
                if (fkEntity != null) {
                    InsertResult<Entity> insertResult = insert(fkEntity, conn);
                    if (insertResult.error != null) {
                        throw new InsertOpException(insertResult.error);
                    }
                    EntityMetadata fkEntityMetadata = fkEntity.metadata();
                    if (!fkEntityMetadata.containsPkColumn()) {
                        throw new InsertOpException(String.format("Expected %s entity to contain a primary key mapping", fkEntityMetadata.tableName));
                    }
                    ColumnInfo columnInfo = fkEntityMetadata.pkColumn();
                    insertableColumns.put(fkColumnInfo.columnName, insertResult.result.get(columnInfo.attributeName));
                }
            }
        }
    }

    //proceed as before
    resolveInsertableColumns(entity, insertableColumns); //...this will require some changes as well
    //...rest of the code
}
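The containsFkColumns(), fkColumns(), containsPkColumn() and pkColumn() helpers used above follow the same pattern as containsEmbedded() and embeddedColumns() from earlier. A quick sketch of how they might look on EntityMetadata:

public boolean containsFkColumns() {
    return columns.stream().anyMatch(info -> info.isFkColumn);
}

public List<ColumnInfo> fkColumns() {
    return columns.stream().filter(info -> info.isFkColumn).collect(Collectors.toList());
}

public boolean containsPkColumn() {
    return columns.stream().anyMatch(info -> info.isPkColumn);
}

public ColumnInfo pkColumn() {
    return columns.stream().filter(info -> info.isPkColumn).findFirst()
            .orElseThrow(() -> new EntityMappingException("There is no primary key column defined in the metadata"));
}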

The resolveInsertableColumns function will need to skip foreign key columns, since these values will already exist in the map at this point.

entityMetadata.columns.stream()
    .filter(info -> !info.isFkColumn) //add this line to avoid overwriting fk values in the map

Now fire up the insert operation and examine the database table to verify the expected rows were created:

public static void main(String[] args) {
    Task task = new Task(null, "soma-tenor", false, LocalDate.now(),
            new Task(null, "soma", false, LocalDate.now(), null));
    InsertResult<Task> taskResult = insert(task);
    if (taskResult.error != null) {
        System.out.println(taskResult.error);
    } else {
        System.out.println(taskResult.result);
    }
}

Expected output

ID                                   | NAME       | DONE  | TASK_CREATED        | PARENT_TASK
5b2c85ed-57c2-47db-ab2f-fce12ee620cd | soma       | false | 2021-08-28 00:00:00 |
c3b38d75-3fed-4243-8e42-126df32a92ff | soma-tenor | false | 2021-08-28 00:00:00 | 5b2c85ed-57c2-47db-ab2f-fce12ee620cd

Another interesting scenario with a 1-1 association is when the primary key is a composite key - the primary key consists of more than one column. Let's consider this table definition in a relational database:

DROP TABLE PUBLIC.TBL_TASK_V2;
CREATE TABLE PUBLIC.TBL_TASK_V2 (
    NUM INTEGER NOT NULL,
    NAME VARCHAR(50) NOT NULL,
    DONE BOOLEAN DEFAULT FALSE NOT NULL,
    TASK_CREATED TIMESTAMP DEFAULT NOW() NOT NULL,
    PARENT_TASK_NUM INTEGER,
    PARENT_TASK_NAME VARCHAR(50),
    CONSTRAINT TASK_PK_V2 PRIMARY KEY (NUM,NAME),
    CONSTRAINT TASK_PARENT_FK_V2 FOREIGN KEY (PARENT_TASK_NUM,PARENT_TASK_NAME) REFERENCES PUBLIC.TBL_TASK_V2(NUM,NAME) ON DELETE RESTRICT ON UPDATE RESTRICT
);
CREATE UNIQUE INDEX PRIMARY_KEY_3 ON PUBLIC.TBL_TASK_V2 (NUM,NAME);
CREATE INDEX TASK_PARENT_FK_V2_INDEX_3 ON PUBLIC.TBL_TASK_V2 (PARENT_TASK_NUM,PARENT_TASK_NAME);

The primary key in this case is clearly num and name. The corresponding entity class would look like this:

TaskV2.java

class TaskV2 implements Entity {

    TaskId taskId;
    Boolean completed;
    LocalDate dateCreated;
    TaskV2 parentTask;

TaskId.java

class TaskId implements Entity {

    Integer number;
    String name;

With this new constraint, the ColumnInfo would need an update to capture the composite pk information in the entity metadata. The ColumnInfoBuilder also needs a similar change.

ColumnInfo.class

String[][] compositeColumns;

The EntityRegistry will change to capture this new information as well. The key update in the entity's metadata is the information about composite columns in addition to marking the field as being a foreign key column.

//TaskId Entity
registry.put(TaskId.class, new EntityMetadata(TaskId.class, null, List.of(
        ColumnInfo.builder().columnName("num").attributeName("number").attributeType(Integer.class).build(),
        ColumnInfo.builder().columnName("name").attributeName("name").build()
), EntityType.COMPOSITE_PK) {

    @Override
    public TaskId entityInstance() {
        return new TaskId();
    }
});

//TaskV2 Entity
registry.put(TaskV2.class, new EntityMetadata(TaskV2.class, "tbl_task_v2", List.of(
        ColumnInfo.builder().attributeName("taskId").attributeType(TaskId.class).isCompositePk(true).build(),
        ColumnInfo.builder().columnName("done").attributeName("completed").attributeType(Boolean.class).build(),
        ColumnInfo.builder().columnName("task_created").attributeName("dateCreated").attributeType(LocalDate.class).build(),
        ColumnInfo.builder().attributeName("parentTask").attributeType(TaskV2.class).isFKColumn(true)
                .compositeColumns(new String[][]{{"parent_task_num", "num"}, {"parent_task_name", "name"}}).build()
)) {

    @Override
    public TaskV2 entityInstance() {
        return new TaskV2();
    }
});

If you try to run the insert now, only the top-level Task will get persisted. This is because the existing logic does not handle the scenario where fkColumnInfo.compositeColumns.length != 0, so let's make that addition.

if (fkColumnInfo.compositeColumns.length == 0) {
    //...remains the same
} else {
    if (fkEntity != null) {
        InsertResult<Entity> insertResult = insert(fkEntity, conn);
        if (insertResult.error != null) {
            throw new InsertOpException(insertResult.error);
        }
        EntityMetadata fkEntityMetadata = fkEntity.metadata();
        if (!fkEntityMetadata.containsCompositePk()) {
            throw new InsertOpException(String.format("Expected %s entity to contain a composite primary key mapping", fkEntityMetadata.tableName));
        }
        ColumnInfo columnInfo = fkEntityMetadata.compositePkColumn();
        String[][] compositeColumns = fkColumnInfo.compositeColumns;
        for (int i = 0; i < compositeColumns.length; i++) {
            String fkColumnName = compositeColumns[i][0];
            String targetColumn = compositeColumns[i][1];
            ColumnInfo column = EntityRegistry.registry.get(columnInfo.attributeType).columns.stream()
                    .filter(info -> info.columnName.equals(targetColumn))
                    .findFirst()
                    .orElseThrow(() -> new InsertOpException(String.format("Expected to find composite column mapping for %s", targetColumn)));
            insertableColumns.put(fkColumnName, ((Entity) insertResult.result.get(columnInfo.attributeName)).get(column.attributeName));
        }
    }
}

And with that change, the insert operation will now create two records and populate the columns as expected.

public static void main(String[] args) {
    TaskId somaTaskId = new TaskId(1, "soma");
    TaskId tenorTaskId = new TaskId(2, "soma-tenor");
    TaskV2 task = new TaskV2(tenorTaskId, false, LocalDate.now(),
            new TaskV2(somaTaskId, false, LocalDate.now(), null));
    InsertResult<TaskV2> taskResult = insert(task);
    if (taskResult.error != null) {
        System.out.println(taskResult.error);
    } else {
        System.out.println(taskResult.result);
    }
}

Table with many-to-1 relationship#

Visit Github Resources

Closely related to the 1-to-1 relationship scenario, in this case the entity class uses a Collection to model the foreign key relationship.

DROP TABLE PUBLIC.TBL_TASK;
CREATE TABLE PUBLIC.TBL_TASK (
    ID UUID(16) DEFAULT RANDOM_UUID() NOT NULL,
    NAME VARCHAR(50) NOT NULL,
    DONE BOOLEAN DEFAULT FALSE NOT NULL,
    TASK_CREATED TIMESTAMP DEFAULT NOW() NOT NULL,
    PARENT_TASK UUID(16),
    CONSTRAINT TASK_PK PRIMARY KEY (ID),
    CONSTRAINT TASK_PARENT_FK FOREIGN KEY (PARENT_TASK) REFERENCES PUBLIC.TBL_TASK(ID) ON DELETE RESTRICT ON UPDATE RESTRICT
);
CREATE UNIQUE INDEX PRIMARY_KEY_4 ON PUBLIC.TBL_TASK (ID);
CREATE INDEX TASK_PARENT_FK_INDEX_4 ON PUBLIC.TBL_TASK (PARENT_TASK);
CREATE UNIQUE INDEX UNIQ_NAME_INDEX_4 ON PUBLIC.TBL_TASK (NAME);

The corresponding entity class representation would now look like this:

class Task implements Entity {

    UUID id;
    String name;
    Boolean completed;
    LocalDate dateCreated;
    Task parentTask;
    Collection<Task> dependsOn;

To capture this new information, the ColumnInfo class needs a new attribute. The corresponding update needs to be done to ColumnInfoBuilder as well.

Boolean isCollection;

The EntityMetadata will also need to be updated to capture this new requirement as well - the dependsOn field.

//Task Entity
registry.put(Task.class, new EntityMetadata(Task.class, "tbl_task", Set.of(
        ColumnInfo.builder().columnName("id").attributeName("id").attributeType(UUID.class).isPkColumn(true).isAutoGenerated(true).build(),
        ColumnInfo.builder().columnName("name").attributeName("name").build(),
        ColumnInfo.builder().columnName("done").attributeName("completed").attributeType(Boolean.class).build(),
        ColumnInfo.builder().columnName("task_created").attributeName("dateCreated").attributeType(LocalDate.class).build(),
        ColumnInfo.builder().columnName("parent_task").attributeName("parentTask").attributeType(Task.class).isFKColumn(true).build(),
        ColumnInfo.builder().columnName("parent_task").attributeName("dependsOn").attributeType(Task.class).isCollection(true).build()
)) {

    @Override
    public Task entityInstance() {
        return new Task();
    }
});

The resolveInsertableColumns function needs a quick enhancement to disregard columns that are of collection type.

entityMetadata.columns.stream()
    //...same filters as before
    .filter(info -> !info.isCollection)

To test this new scenario, let's consider using the data below:

public static void main(String[] args) {
    Task measureIngredients = new Task(null, "gather ingredients needed", false, null);
    Task pourFlour = new Task(null, "pour flour in a bowl", false, measureIngredients);
    Task addWater = new Task(null, "mix flour with water", false, pourFlour);
    Task addEggs = new Task(null, "beat two eggs into mix", false, addWater);
    Task addFlavor = new Task(null, "add sugar to sweeten", false, addEggs);
    Task heatSkillet = new Task(null, "heat up the pan", false, addFlavor);
    Task pourMix = new Task(null, "pour mix to cook", false, heatSkillet);
    Task flipPancake = new Task(null, "flip the pancake to cook", false, pourMix);
    Task gatherCondiments = new Task(null, "gather breakfast condiments", false, measureIngredients);
    Task setupTable = new Task(null, "pour syrup to serve", false, gatherCondiments);
    Task servePancake = new Task(null, "serve pancake on a plate", false, flipPancake);
    Task serveMilk = new Task(null, "pour milk to serve", false, setupTable);

    Task task = new Task(null, "serve breakfast", false, LocalDate.now(), null, List.of(servePancake, serveMilk));
    InsertResult<Task> taskResult = insert(task);
    if (taskResult.error != null) {
        System.out.println(taskResult.error);
    } else {
        System.out.println(taskResult.result);
    }
}

If you try to run the insert operation without modifying the insert function, you will get an error, since the function will try to serialize the collection, which is not at all a goal. Let's make some changes in the insert function to remedy this.

//handle collection associations
if (metadata.containsCollectionColumns()) {
    for (ColumnInfo collectionColumnInfo : metadata.collectionColumns()) {
        Collection<Entity> collectionAttribute = (Collection<Entity>) entity.get(collectionColumnInfo.attributeName);
        if (collectionAttribute.size() > 0) {
            for (Entity collectionEntity : collectionAttribute) {
                InsertResult<Entity> insertResult = insert(collectionEntity, conn);
                if (insertResult.error != null) {
                    throw new InsertOpException(insertResult.error);
                }
            }
        }
    }
}

//handle foreign key associations
if (metadata.containsFkColumns()) {
    //...same as before
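As with the foreign key helpers, containsCollectionColumns() and collectionColumns() are simple filters over the metadata; a sketch, with names mirroring the style used so far:

public boolean containsCollectionColumns() {
    return columns.stream().anyMatch(info -> info.isCollection);
}

public List<ColumnInfo> collectionColumns() {
    return columns.stream().filter(info -> info.isCollection).collect(Collectors.toList());
}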

This approach will work if the insert is as shown below:

Task task1 = new Task(null, "serve pancake on a plate", false, null);
Task task2 = new Task(null, "pour milk to serve", false, null);

Task task = new Task(null, "serve breakfast", false, LocalDate.now(), null, List.of(task1, task2));
InsertResult<Task> taskResult = insert(task);

However, if you keep everything above the same and just make a slight change to include task1 as the parent task of task2, you will run into a "Unique index or primary key violation" on the name column.

Task task2 = new Task(null, "pour milk to serve", false, task1); //make this slight change

That's because the current setup will first persist task1 when it's a collection item of task and then again when it's a "parentTask" of task2. So the remedy is to check that the foreign key entity item has no primary key value before persisting.

if (fkColumnInfo.compositeColumns.length == 0) {
    if (fkEntity != null) {
        EntityMetadata fkEntityMetadata = fkEntity.metadata();
        if (!fkEntityMetadata.containsPkColumn()) {
            throw new InsertOpException(String.format("Expected %s entity to contain a primary key mapping", fkEntityMetadata.tableName));
        }
        ColumnInfo columnInfo = fkEntityMetadata.pkColumn();
        if (fkEntity.get(columnInfo.attributeName) != null) {
            insertableColumns.put(fkColumnInfo.columnName, fkEntity.get(columnInfo.attributeName));
        } else {
            InsertResult<Entity> insertResult = insert(fkEntity, conn);
            if (insertResult.error != null) {
                throw new InsertOpException(insertResult.error);
            }
            insertableColumns.put(fkColumnInfo.columnName, insertResult.result.get(columnInfo.attributeName));
        }
    }
}

The scenario above deals with the situation where the primary key is NOT a composite key. In the case of a composite key, let's examine the differences using the schema described below.

DROP TABLE PUBLIC.TBL_TASK_V2;
CREATE TABLE PUBLIC.TBL_TASK_V2 (
    NUM INTEGER NOT NULL,
    NAME VARCHAR(50) NOT NULL,
    DONE BOOLEAN DEFAULT FALSE NOT NULL,
    TASK_CREATED TIMESTAMP DEFAULT NOW() NOT NULL,
    PARENT_TASK_NUM INTEGER,
    PARENT_TASK_NAME VARCHAR(50),
    CONSTRAINT TASK_PK_V2 PRIMARY KEY (NUM,NAME),
    CONSTRAINT TASK_PARENT_FK_V2 FOREIGN KEY (PARENT_TASK_NUM,PARENT_TASK_NAME) REFERENCES PUBLIC.TBL_TASK_V2(NUM,NAME) ON DELETE RESTRICT ON UPDATE RESTRICT
);
CREATE UNIQUE INDEX PRIMARY_KEY_3 ON PUBLIC.TBL_TASK_V2 (NUM,NAME);
CREATE INDEX TASK_PARENT_FK_V2_INDEX_3 ON PUBLIC.TBL_TASK_V2 (PARENT_TASK_NUM,PARENT_TASK_NAME);

The corresponding Entity would be as shown below

TaskV2

class TaskV2 implements Entity {

    TaskId taskId;
    Boolean completed;
    LocalDate dateCreated;
    TaskV2 parentTask;
    Collection<TaskV2> dependsOn;

TaskId

class TaskId implements Entity {

    Integer number;
    String name;

The EntityMetadata will need to be updated to capture this new requirement as well - the dependsOn field.

//TaskId Entity
registry.put(TaskId.class, new EntityMetadata(TaskId.class, null, Set.of(
        ColumnInfo.builder().columnName("num").attributeName("number").attributeType(Integer.class).build(),
        ColumnInfo.builder().columnName("name").attributeName("name").build()
), EntityType.COMPOSITE_PK) {

    @Override
    public TaskId entityInstance() {
        return new TaskId();
    }
});

//TaskV2 Entity
registry.put(TaskV2.class, new EntityMetadata(TaskV2.class, "tbl_task_v2", Set.of(
        ColumnInfo.builder().attributeName("taskId").attributeType(TaskId.class).isCompositePk(true).build(),
        ColumnInfo.builder().columnName("done").attributeName("completed").attributeType(Boolean.class).build(),
        ColumnInfo.builder().columnName("task_created").attributeName("dateCreated").attributeType(LocalDate.class).build(),
        ColumnInfo.builder().attributeName("parentTask").attributeType(TaskV2.class).isFKColumn(true)
                .compositeColumns(new String[][]{{"parent_task_num", "num"}, {"parent_task_name", "name"}}).build(),
        ColumnInfo.builder().attributeName("dependsOn").attributeType(TaskV2.class).isCollection(true)
                .compositeColumns(new String[][]{{"parent_task_num", "num"}, {"parent_task_name", "name"}}).build()
)) {

    @Override
    public TaskV2 entityInstance() {
        return new TaskV2();
    }
});

To test this new scenario, let's consider a simple use-case with the data below:

TaskV2 task1 = new TaskV2(new TaskId(1, "serve pancake on a plate"), false, null);
TaskV2 task2 = new TaskV2(new TaskId(2, "pour milk to serve"), false, task1);
TaskV2 task = new TaskV2(new TaskId(3, "serve breakfast"), false, LocalDate.now(), null, List.of(task1, task2));

InsertResult<TaskV2> taskResult = insert(task);
if (taskResult.error != null) {
    System.out.println(taskResult.error);
} else {
    System.out.println(taskResult.result);
}

The only remaining change will be to check whether the entity is already persisted before persisting it again, like we did earlier.

if (fkColumnInfo.compositeColumns.length == 0) {
    //...same as before
} else {
    if (fkEntity != null) {
        EntityMetadata fkEntityMetadata = fkEntity.metadata();
        if (!fkEntityMetadata.containsCompositePk()) {
            throw new InsertOpException(String.format("Expected %s entity to contain a composite primary key mapping", fkEntityMetadata.tableName));
        }
        ColumnInfo columnInfo = fkEntityMetadata.compositePkColumn();
        if (fkEntity.get(columnInfo.attributeName) != null) {
            String[][] compositeColumns = fkColumnInfo.compositeColumns;
            for (String[] compositeColumn : compositeColumns) {
                String fkColumnName = compositeColumn[0];
                String targetColumn = compositeColumn[1];
                ColumnInfo column = EntityRegistry.registry.get(columnInfo.attributeType).columns.stream()
                        .filter(info -> info.columnName.equals(targetColumn))
                        .findFirst()
                        .orElseThrow(() -> new InsertOpException(String.format("Expected to find composite column mapping for %s", targetColumn)));
                insertableColumns.put(fkColumnName, ((Entity) fkEntity.get(columnInfo.attributeName)).get(column.attributeName));
            }
        } else {
            InsertResult<Entity> insertResult = insert(fkEntity, conn);
            if (insertResult.error != null) {
                throw new InsertOpException(insertResult.error);
            }
            String[][] compositeColumns = fkColumnInfo.compositeColumns;
            for (String[] compositeColumn : compositeColumns) {
                String fkColumnName = compositeColumn[0];
                String targetColumn = compositeColumn[1];
                ColumnInfo column = EntityRegistry.registry.get(columnInfo.attributeType).columns.stream()
                        .filter(info -> info.columnName.equals(targetColumn))
                        .findFirst()
                        .orElseThrow(() -> new InsertOpException(String.format("Expected to find composite column mapping for %s", targetColumn)));
                insertableColumns.put(fkColumnName, ((Entity) insertResult.result.get(columnInfo.attributeName)).get(column.attributeName));
            }
        }
    }
}

With these changes, the insert operation should continue to work as expected.

Mapping relationship using adjacent table#

Visit Github Resources

So far, the relationship scenarios I have considered with the insert operation have assumed the relationship to be defined on the same table. Does having the relationship span two tables make any difference?

non-composite pk#

Let's examine this scenario using the schema below, where a User and a Task have a 1-to-1 relationship:

drop table if exists tbl_task;
drop table if exists tbl_user;

create table if not exists tbl_user (
    id UUID default random_uuid(),
    email_address varchar(100) not null,
    nickname varchar(50),
    date_joined timestamp not null default now(),
    access_level varchar default 'VISITOR',
    addr_city varchar(50),
    addr_state varchar(30),
    addr_postal_code varchar(10),
    constraint user_pk primary key(id)
);

create table if not exists tbl_task (
    id UUID default random_uuid(),
    name varchar(50) not null,
    done boolean not null default false,
    task_created timestamp not null default now(),
    parent_task UUID,
    task_assignee UUID,
    constraint task_pk primary key(id),
    constraint uniq_name unique (name),
    constraint task_parent_fk foreign key(parent_task) references tbl_task(id),
    constraint task_assignee_fk foreign key(task_assignee) references tbl_user(id)
);

The respective class representations will look like this:

Task entity

public class Task implements Entity {
    UUID id;
    String name;
    Boolean completed;
    LocalDate dateCreated;
    Task parentTask;
    Collection<Task> dependsOn;
    //...the rest is similar to previous examples

User entity

public class User implements Entity {
    UUID id;
    String emailAddress;
    String nickName;
    AccessLevel accessLevel;
    Address address;
    LocalDate dateJoined;
    Collection<Task> assignments;
    //...the rest is similar to earlier examples

To model assignments living in a different table, we need to introduce another attribute to the ColumnInfo class, and make the associated change to the ColumnInfoBuilder class.

String inverseFkColumn;

This new attribute needs to be added to the User entity's metadata, which describes the assignments relationship using a different table.

ColumnInfo.builder().columnName("task_assignee").attributeName("assignments").attributeType(Task.class).isCollection(true)
    .inverseFkColumn("id").build()

One last thing needs to be updated to avoid trying to persist an already existing record: we need to check if an entity has an identifier before persisting it, and if it does, we simply don't perform an insert operation.

private static Boolean entityIdentifierExists(Entity entity) {
    EntityMetadata metadata = entity.metadata();
    if (metadata.containsPkColumn()) {
        ColumnInfo columnInfo = metadata.pkColumn(); //the pk MUST be auto-generated for this to work
        return entity.get(columnInfo.attributeName) != null;
    }
    if (metadata.containsCompositePk()) {
        //?? what should be done here ??
    }
    return false;
}

With that, we need to add this check in the loop for persisting collection entities.

//handle collection associations
if (metadata.containsCollectionColumns()) {
    for (ColumnInfo collectionColumnInfo : metadata.collectionColumns()) {
        Collection<Entity> collectionAttribute = (Collection<Entity>) entity.get(collectionColumnInfo.attributeName);
        if (collectionAttribute.size() > 0) {
            for (Entity collectionEntity : collectionAttribute) {
                if (!entityIdentifierExists(collectionEntity)) {
                    InsertResult<Entity> insertResult = insert(collectionEntity, conn);
                    if (insertResult.error != null) {
                        throw new InsertOpException(insertResult.error);
                    }
                }
            }
        }
    }
}

With these changes made, how different would the insert operation be? Well, let's start by persisting this data:

public static void main(String[] args) {
    Task task1 = new Task(null, "serve pancake on a plate", false, null);
    Task task2 = new Task(null, "pour milk to serve", false, task1);
    Task task3 = new Task(null, "serve breakfast", false, LocalDate.now(), null, List.of(task1, task2));

    Address address = new Address("Chicago", "IL", "60060");
    User user = new User(null, "jim@email.com", "jimmy", AccessLevel.USER, address, LocalDate.now(), List.of(
            task1, task2, task3
    ));
    InsertResult<User> userResult = insert(user);
    if (userResult.error != null) {
        System.out.println(userResult.error);
    } else {
        System.out.println(userResult.result);
    }
}

The pleasant surprise is that no other change is required to insert the records above correctly.

composite pk#

Let's now examine the same kind of schema, but one that has a composite pk this time around.

drop table if exists tbl_task_v2;
drop table if exists tbl_user_v2;

create table if not exists tbl_user_v2 (
  email_address varchar(100) not null,
  username varchar(50) not null,
  nickname varchar(50),
  date_joined timestamp not null default now(),
  access_level varchar default 'VISITOR',
  addr_city varchar(50),
  addr_state varchar(30),
  addr_postal_code varchar(10),
  constraint user_pk_v2 primary key(email_address, username)
);

create table if not exists tbl_task_v2 (
  num int not null,
  name varchar(50) not null,
  done boolean not null default false,
  task_created timestamp not null default now(),
  parent_task_num int,
  parent_task_name varchar(50),
  task_assignee_email varchar(100),
  task_assignee_uname varchar(50),
  constraint task_pk_v2 primary key(num, name),
  constraint task_parent_fk_v2 foreign key(parent_task_num, parent_task_name) references tbl_task_v2(num, name),
  constraint task_assignee_fk_v2 foreign key(task_assignee_email, task_assignee_uname) references tbl_user_v2(email_address, username)
);

The respective class representations will look like this:

class TaskV2 implements Entity {
    TaskId taskId;
    Boolean completed;
    LocalDate dateCreated;
    TaskV2 parentTask;
    Collection<TaskV2> dependsOn;

    //...the rest is the same as before

class TaskId implements Entity, Comparable<TaskId> {
    Integer number;
    String name;

    //...the rest is the same as before

class UserV2 implements Entity {
    UserId userId;
    String nickName;
    AccessLevel accessLevel;
    Address address;
    LocalDate dateJoined;
    Collection<TaskV2> assignments;

    //...the rest is the same as before

class UserId implements Entity, Comparable<UserId> {
    String emailAddress;
    String userName;

    //...the rest is the same as before

To model assignments being in a different table, the inverseFkColumn attribute needs to be added to the UserV2 entity's metadata.

ColumnInfo.builder().attributeName("assignments").attributeType(Task.class).isCollection(true)
    .compositeColumns(new String[][]{{"task_assignee_email", "email_address"}, {"task_assignee_uname", "username"}})
    .inverseFkColumn("id").build()

With these changes made, how different would the insert operation be? Well, let's start by persisting this data:

public static void main(String[] args) {
    TaskV2 task1 = new TaskV2(new TaskId(1, "serve pancake on a plate"), false, null);
    TaskV2 task2 = new TaskV2(new TaskId(2, "pour milk to serve"), false, task1);
    TaskV2 task3 = new TaskV2(new TaskId(3, "serve breakfast"), false, LocalDate.now(), null, List.of(task1, task2));

    Address address = new Address("Chicago", "IL", "60060");
    UserV2 user = new UserV2(new UserId("jim@email.com", "jimmy"), "salsa", AccessLevel.USER, address, LocalDate.now(), List.of(
            task1, task2, task3
    ));
    InsertResult<UserV2> userResult = insert(user);
    if (userResult.error != null) {
        System.out.println(userResult.error);
    } else {
        System.out.println(userResult.result);
    }
}

Unfortunately, the surprise in this case is that you'll get an error pointing to an attempt to persist a record twice. This will be true for any case where the primary key is not auto-generated, since the check we made previously assumed that if a pk value exists, then the entity must have already been persisted. If you apply the same reasoning in this scenario, the entities with non-auto-generated primary keys will not be persisted. This is currently a limitation, but we'll try and figure out a better way to handle this scenario later on.

Mapping relationship using junction table#

The scenario we shall consider next is mapping a relationship, either 1-to-1 or many-to-1, using a join table. Let's begin with the schema below.

create table if not exists tbl_user_v3 (
    id UUID default random_uuid(),
    email_address varchar(100) not null,
    nickname varchar(50),
    date_joined timestamp not null default now(),
    access_level varchar default 'VISITOR',
    addr_city varchar(50),
    addr_state varchar(30),
    addr_postal_code varchar(10),
    constraint user_pk_v3 primary key(id)
);

create table if not exists tbl_task_v3 (
    id UUID default random_uuid(),
    name varchar(50) not null,
    done boolean not null default false,
    task_created timestamp not null default now(),
    parent_task UUID,
    constraint task_pk_v3 primary key(id),
    constraint uniq_name_v3 unique (name),
    constraint task_parent_fk_v3 foreign key(parent_task) references tbl_task(id)
);

create table if not exists tbl_user_task (
    task_id UUID not null,
    task_assignee UUID not null,
    constraint user_task_id_fk foreign key(task_id) references tbl_task_v3(id),
    constraint user_task_assignee_fk foreign key(task_assignee) references tbl_user_v3(id)
);

We shall borrow the exact same JoinTable concepts we used when examining the select operation. The User entity will therefore now use this new metadata for its assignments attribute.

ColumnInfo.builder().attributeName("assignments").attributeType(Task.class).isCollection(true)
    .joinTable(JoinTable.builder().tableName("tbl_user_task").inverseColumn("id").onColumn("task_id").whereColumn("task_assignee").build()).build()
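For reference, the JoinTable carried over from the select article is essentially a small value object describing the junction table. A rough sketch based on the builder calls above - the field names come from those calls, while the meanings in the comments are inferred from how they are used here:

class JoinTable {
    String tableName;     //name of the junction table, e.g. tbl_user_task
    String inverseColumn; //column on the owning entity's table used in the join
    String onColumn;      //junction table column referencing the collection entity
    String whereColumn;   //junction table column referencing the owning entity

    //...plus a JoinTableBuilder in the same fluent style as ColumnInfoBuilder
}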

With these changes made, how different would the insert operation be? Well, let's start by persisting the data below:

public static void main(String[] args) {
    Address address = new Address("Chicago", "IL", "60060");

    TaskV3 task1 = new TaskV3(null, "serve pancake on a plate", false, null);
    TaskV3 task2 = new TaskV3(null, "pour milk to serve", false, task1);
    TaskV3 task3 = new TaskV3(null, "serve breakfast", false, LocalDate.now(), null, List.of(task1, task2));

    UserV3 user = new UserV3(null, "jim@email.com", "jimmy", AccessLevel.USER, address, List.of(
            task1, task2, task3
    ));
    InsertResult<UserV3> userResult = insert(user);
    if (userResult.error != null) {
        System.out.println(userResult.error);
    } else {
        System.out.println(userResult.result);
    }
}