feat: In-Memory DmlHandler Test Fixture #809
Conversation
```typescript
  return results;
}

insertRow(tableName: string, row: any): DataRow {
```
Instead of `any`, it is better to introduce something like `type RowType = Record<string, string | number>`.
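A minimal sketch of that suggestion (the `DataRow` shape here is an assumption for illustration, not the PR's exact definition):

```typescript
// Constrain row values to string | number instead of using `any`.
type RowType = Record<string, string | number>;

// Hypothetical minimal DataRow shape, assumed for this sketch.
interface DataRow {
  data: RowType;
}

function insertRow(tableName: string, row: RowType): DataRow {
  // ...the fixture would store the row under tableName here...
  return { data: row };
}

const inserted = insertRow("accounts", { id: 1, name: "alice" });
```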
Yep, that's a good idea. I will do so. The only thing is that it might be possible for the type to be something other than string or number, such as Buffer or Date. But I think the return from the actual Postgres query would serialize the result to a string anyway? It needs more testing, but string and number should sufficiently cover use cases for the time being.
Actually, a quick test on the receivers indexer shows we insert a string type into the date type and that works. So I actually need to update the type gen a bit here to allow string or Date. I still need to test later whether inserting a Date type works at all, but at least I know string works. It seems the JSON and JSONB options are what are problematic: you can input any valid JSON type, which is object, array, string, number, and so on. So `any` actually makes sense there, as the input really is anything.
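To illustrate why JSON/JSONB columns resist a narrow union: a valid JSON value may be an object, array, string, number, boolean, or null. The `JsonValue` name below is my own for this sketch, not a type from the PR:

```typescript
// A recursive type covering every legal JSON value.
type JsonValue =
  | string
  | number
  | boolean
  | null
  | JsonValue[]
  | { [key: string]: JsonValue };

// All of these are valid inputs for a JSONB column.
const examples: JsonValue[] = [
  { receipts: ["r1", "r2"] },
  [1, 2, 3],
  "plain string",
  42,
  null,
];
```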
Force-pushed 57af68d to 5af91d1
```typescript
import { TableDefinitionNames } from "../indexer";
import InMemoryDmlHandler from "./dml-handler-fixture";

const DEFAULT_ITEM_1_WITHOUT_ID = {
```
I may later on try to make some kind of builder function for creating these objects, but for now a couple static objects suffice for most unit test cases.
```typescript
constructor(data: any, primaryKeys: string[]) {
  this.data = data;
  this.primaryKeys = primaryKeys.sort();
```
I sort the primary keys to ensure that the `primaryKey()` function below can reliably stringify the primary-key attribute values and compare them against other objects' calls of it without worrying about order.
I did think about comparing attribute values directly instead, though. Still not totally sure which option I want to go for.
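The idea can be sketched as follows (function and type names here are assumptions for illustration): sorting the primary-key column names before stringifying makes the serialized key independent of the order the keys were supplied in, so equal rows always produce equal key strings.

```typescript
type RowData = Record<string, string | number>;

// Serialize a row's primary-key values in a stable, order-independent way.
function primaryKey(row: RowData, primaryKeys: string[]): string {
  const sortedKeys = [...primaryKeys].sort();
  return JSON.stringify(sortedKeys.map((key) => row[key]));
}

const keyA = primaryKey({ receipt_id: "r1", block_height: 5 }, ["receipt_id", "block_height"]);
const keyB = primaryKey({ block_height: 5, receipt_id: "r1" }, ["block_height", "receipt_id"]);
// keyA and keyB are identical strings despite the differing key order
```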
```typescript
private getSerialValue(columnName: string): number {
  const serialCounterKey = `${this.specification.tableName}-${columnName}`;
  let counterValue: number = this.serialCounter.get(serialCounterKey) ?? 0;
  this.serialCounter.set(serialCounterKey, counterValue + 1);
```
Auto-increments each time this function is called. It is intended behavior for this function to work standalone, separately from actual column values. In Postgres, if there is a SERIAL column and you insert an explicit value, the sequence counter does not actually skip ahead past your specified value.
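The behavior described can be sketched like this (class and method names are mine, not the fixture's): each call yields the next value for a table/column pair, independent of any values explicitly written to that column, mirroring how a Postgres SERIAL sequence ignores explicit inserts.

```typescript
class SerialCounters {
  private readonly counters = new Map<string, number>();

  // Return the current counter for this table/column pair, then increment it.
  next(tableName: string, columnName: string): number {
    const key = `${tableName}-${columnName}`;
    const value = this.counters.get(key) ?? 0;
    this.counters.set(key, value + 1);
    return value;
  }
}

const counters = new SerialCounters();
counters.next("receipts", "id"); // 0
counters.next("receipts", "id"); // 1
counters.next("blocks", "id"); // 0: each table-column pair has its own sequence
```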
```typescript
const insertedRows: PostgresRowEntity[] = [];

for (const row of rowsToInsert) {
  const rowCopy = this.copyRow(row);
```
Since JS objects are passed by reference, I need to copy the data to ensure that input or output data don't impact stored table data.
I chose to do this copying as part of the interface DML functions, rather than closer to functions like insertRow, to hopefully make it clearer that this copying is being performed. Open to suggestions on any other ideas for where this copying should be done.
I went with JSON.parse of a JSON.stringify to do a deep copy, which should work for nearly all cases.
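A sketch of that deep-copy approach (the `copyRow` signature here is assumed for illustration): a JSON round-trip clones plain data, so later mutation of the caller's object cannot reach stored table rows. One caveat worth noting: non-JSON values are transformed along the way (a Date becomes an ISO string; undefined values and functions are dropped).

```typescript
// Deep-copy a row via a JSON round-trip.
function copyRow<T>(row: T): T {
  return JSON.parse(JSON.stringify(row)) as T;
}

const input = { id: 1, tags: ["a"] };
const stored = copyRow(input);
input.tags.push("b"); // mutating the caller's object...
// ...leaves the stored copy untouched: stored.tags is still ["a"]
```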
Really minor comments, overall looks good :)
Force-pushed 274d8f9 to 31757f5
Force-pushed d6d849f to 29b2748
In order to enable quick testing of indexer code during local indexer development, there needs to be a way to functionally mock the dependencies used by Indexer code. These dependencies are roughly encapsulated under the context object. In particular, context.db is of high importance, as it is the method through which Indexers interact with their persistent data. This PR focuses on creating an MVP of an in-memory DmlHandler test fixture which can be used interchangeably with an actual DmlHandler. This allows context.db calls to behave like real calls to an actual Postgres DB without an instance actually running (which would be hard to integrate into unit testing).
This PR is mainly focused on getting a basic DmlHandler test fixture out the door. It exposes the same functions, which roughly behave as they would when called against an actual DB. There is the added benefit of being extremely quick to tear down, as all data is held in memory, making it suitable for fast iteration through unit testing. Since I am essentially mocking Postgres DB responses, the correctness standard is much lower than with an actual PG DB; but because DmlHandler simplifies user interactions with the DB anyway, we can mock a great many Indexer context.db use cases well enough to be useful to end users.
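As a hypothetical usage sketch (not the project's exact API; the interface and class names below are assumptions): because the in-memory fixture exposes the same surface as the real DmlHandler, a unit test can back context.db with it and never touch Postgres.

```typescript
type Row = Record<string, string | number>;

// The shared surface a test can rely on, whether backed by Postgres or memory.
interface DmlHandlerLike {
  insert: (tableName: string, rows: Row[]) => Row[];
}

class InMemoryHandler implements DmlHandlerLike {
  private readonly tables = new Map<string, Row[]>();

  insert(tableName: string, rows: Row[]): Row[] {
    const table = this.tables.get(tableName) ?? [];
    const copies = rows.map((row) => ({ ...row })); // defensive copy
    table.push(...copies);
    this.tables.set(tableName, table);
    return copies;
  }

  select(tableName: string): Row[] {
    return this.tables.get(tableName) ?? [];
  }
}

// A test hands the fixture to the Indexer wherever a DmlHandler is expected.
const handler = new InMemoryHandler();
handler.insert("receipts", [{ receipt_id: "r1", block_height: 5 }]);
```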