Custom Tables

This article will explain what custom data tables are, why they are useful, how to set them up, and how to use them from within P3L.

What are custom tables?

Custom tables are collections of data that you can create and store in Paperless Parts. They are essentially CSV files, but with slightly more rigid enforcement of column names and row values.

Why are custom tables useful?

Custom tables are useful because they allow you to create small or large collections of data that your pricing formulas can reference. From within P3L, you can filter on any column in your data table to look up pertinent information based on characteristics of the part you are currently quoting. As long as you can collect your data into a clean CSV, you can use it to influence pricing. A common use case for custom tables is to look up material pricing or properties given the material and geometric properties of a part.

How do I set up my custom tables?

You can manage your custom data tables from the processes page. Additionally, you can create, update, and delete custom tables using our open API. Every table must have a unique alphanumeric name. Every table has a discrete number of type-enforced columns. Tables can have as many rows as you like. Columns must have alphanumeric names WITHOUT spaces and must be one of three types: boolean, numeric, or string. These types are enforced when you add or update rows, and the conditional values used when performing lookups against these tables are also type-enforced.

Look for the "Tables" tab on the processes page. Here you can create your new tables.

Once you create a new table on this page, you can click on any row to configure the details of a table.

To add data to your custom table, you need to provide a configuration CSV file and a data CSV file. The configuration file has two columns: column_name and value_type. The column_name is the alphanumeric name of the column that will be used to perform lookups from within your P3L formulas. The value_type can be one of three values: boolean, numeric, or string. The data file you upload must have column names matching the ones specified in the configuration file. Additionally, the value of every cell in your data CSV must be castable to the value_type specified for that column in the configuration CSV. Your CSV files will not upload if any issues are found with format, column names, or typing. To perform this upload, click the "Update Table" option from the gear drop-down in the top right.
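
For example, the material_inventory table referenced by the P3L examples later in this article could be set up with a configuration file and a data file like the ones below. The column names match what those examples expect; the row values are purely illustrative.

column_name,value_type
material,string
diameter,numeric
length,numeric
bar_cost,numeric
display,string

material,diameter,length,bar_cost,display
Aluminum 6061-T6,1.0,72,18.50,1.0 x 72 Aluminum 6061-T6 rod
Aluminum 6061-T6,1.5,72,33.00,1.5 x 72 Aluminum 6061-T6 rod
Stainless 304,1.0,72,41.25,1.0 x 72 Stainless 304 rod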

After submitting, you should now be able to view your table data.

To edit the configuration or data in your table, use the gear drop-down to upload new CSV files. (If you have previously provided a configuration file and just want to update your data, you can provide only a CSV with the new row data; you do not have to provide the configuration file again. Note, however, that only columns whose names match the configuration file will actually populate your custom table.)

The gear drop-down also lets you download CSV files for both the table configuration and the table data. Furthermore, once you have set up P3L formulas that reference this table, you will be able to quickly view those formulas from the same drop-down.

How do I use my custom tables in my pricing formulas?

You can access information from your custom tables in both operation-level and process-level P3L. Similar to how you declare variables with the var function in operation P3L, you declare a table_var that references a specific table along with any filtering and ordering conditions:

table_var(name: str, description: str, table_name: str, filters: List[filter | exclude], order_by: List[str], display_column_name: str)

This declares a table variable with the given name and description from the table named table_name. The filters input should always be the output of a create_filter() function call (see below). The order_by input should always be the output of a create_order_by() function call (see below). Together, filters and order_by dictate how the table is queried and how the results are ordered. The output of this function is either None, if no rows satisfy the query filters, or the first row that satisfies the query filters according to the ordering specified by order_by. If the result is not None, you can use dot notation to access the columns of the returned row. The display_column_name is the name of the column that will be used as the display value for the row in the quoting interface. Similar to dynamic variables, table variables can be overridden in the quoting interface. See below for examples of usage.
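
For instance, a minimal sketch of a table variable that queries the material_inventory table shown earlier might look like the following (the 1.25 inch threshold is just an illustrative literal):

row = table_var(
  'Stock Lookup',
  'smallest bar with a diameter of at least 1.25 inches',
  'material_inventory',
  create_filter(filter('diameter', '>=', 1.25)),
  create_order_by('diameter'),
  'display'
)

if row:
  # columns on the returned row are accessed with dot notation
  set_operation_name(row.display)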

create_filter(*args: [filter | exclude])

This function takes an arbitrary number of inputs, each of which is a filter() or exclude() function call. See below for examples of usage.

filter(column_name: str, condition: str, value)

This creates an inclusive filter based on the column_name, the condition, and the value. It will include all rows that satisfy the condition described by the inputs. See below for information on supported condition types. The value needs to match the type of the column specified by column_name. The value can be dynamic, such as a property on the part object or an upstream dynamic variable.

exclude(column_name: str, condition: str, value)

This creates an exclusive filter based on the column_name, the condition, and the value. It will exclude all rows that satisfy the condition described by the inputs. See below for information on supported condition types. The value needs to match the type of the column specified by column_name. The value can be dynamic, such as a property on the part object or an upstream dynamic variable.

create_order_by(*args: str)

This function takes an arbitrary number of string arguments that are column names on the custom table. This will inform how to sort the rows that satisfy the filters provided to the table variable. Establishing how to order the rows is important because the output of the table_var function is the first row that satisfies the filter after applying ordering. When specifying a column name to sort by, it will be in ascending order. However, if you want to order in descending fashion, simply add a "-" before the input. For example, to order rows by a column named diameter so that the row with the smallest diameter value is returned first, you would do create_order_by('diameter'). To order rows by a column named diameter so that the row with the greatest diameter is returned first, you would do create_order_by('-diameter'). This also supports multiple column ordering, so something like create_order_by('diameter', 'length', '-thickness') is valid.

create_range(a, b)

This function takes numeric-like arguments a and b and creates a range to use as the value for a filter() or exclude() function call with 'range' as the specified condition. These inputs can be variables or literals. See the table below for the supported filter conditions.

Filter Conditions

The conditions below can be used with both the filter() and exclude() functions. Let's assume we have a custom table with the columns material (string), diameter (numeric), and length (numeric).

Condition | Compatible Types         | Usage
=         | boolean, numeric, string | filter('material', '=', 'Aluminum 6061-T6')
>         | numeric, string          | exclude('length', '>', 10)
>=        | numeric, string          | filter('diameter', '>=', 5)
<         | numeric, string          | exclude('length', '<', 10)
<=        | numeric, string          | filter('diameter', '<=', 5)
range     | numeric                  | filter('diameter', 'range', create_range(5, 6))
contains  | string                   | filter('material', 'contains', 'Aluminum')
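
Neither exclude() nor create_range() appears in the full examples below, so here is a brief sketch of how they can be combined against the same material_inventory table (the specific bounds and strings are just illustrative):

row = table_var(
  'Mid-Size Stock',
  'bars between 1 and 2 inches in diameter, excluding stainless alloys',
  'material_inventory',
  create_filter(
    filter('diameter', 'range', create_range(1, 2)),
    exclude('material', 'contains', 'Stainless'),
  ),
  create_order_by('bar_cost'),
  'display'
)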

Let's walk through an example that references the material_inventory table above. For a lathe process, let's set up a material operation that looks to the custom table to find the cheapest bar in terms of cost per length. We need to make sure this bar is large enough to fit at least one unit, so our rod stock must have a diameter and length greater than or equal to our final part geometry. We then sort our result to ensure we grab the row with the smallest diameter, greatest length (the greater the length, the cheaper it will be in terms of cost per length), and smallest cost. Based on our selection, we determine the number of parts per rod and resolve a cost based on how many units we have to make. We then store values like rod length and diameter in the workpiece so we can reference them in downstream operations.

units_in()
lathe = analyze_lathe()

part_diameter = var('Part Diameter', 0, '', number, frozen=False)
part_diameter.update(2 * lathe.stock_radius)
part_diameter.freeze()

part_length = var('Part Length', 0, '', number, frozen=False)
part_length.update(lathe.stock_length)
part_length.freeze()

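# Find a bar of the right material that is large enough for the part, preferring
# the smallest diameter, then the longest length, then the lowest cost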
row = table_var(
  'Material Selection',
  'look up for dynamic cylindrical stock selection',
  'material_inventory',
  create_filter(
    filter('material', '=', part.material),
    filter('diameter', '>=', part_diameter),
    filter('length', '>=', part_length),
  ),
  create_order_by('diameter', '-length', 'bar_cost'),
  'display'
)

if row:
  set_operation_name(row.display)

rod_cost = var('Rod Cost', 0, '', currency, frozen=False)
if row:
  rod_cost.update(row.bar_cost)
rod_cost.freeze()

rod_diameter = var('Rod Diameter', 0, '', number, frozen=False)
if row:
  rod_diameter.update(row.diameter)
rod_diameter.freeze()

rod_length = var('Rod Length', 0, '', number, frozen=False)
if row:
  rod_length.update(row.length)
rod_length.freeze()
  
cut_width = var('Cut Width', 0.25, 'width of cut in inches for saw', number)

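# Number of whole parts obtainable from one rod, accounting for the saw cut width between parts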
parts_per_rod = var('Parts Per Rod', 0, '', number, frozen=False)
parts_per_rod.update(floor(rod_length / (part_length + cut_width)))
parts_per_rod.freeze()

rod_count = ceil(part.qty / parts_per_rod)

PRICE = rod_count * rod_cost
DAYS = 0

set_workpiece_value('rod_diameter', rod_diameter)
set_workpiece_value('rod_length', rod_length)

Let's walk through another example to get an even better sense of how to take advantage of custom tables. Let's say we have a laser cutting operation for a sheet metal process. For our laser cutter, we have pierce times and cut rates based on the type and thickness of the material. So, let's imagine we set up a custom table with the name 'laser_cut_rates'. The columns (and types) for this custom table are material_family (string), thickness (numeric), cut_rate (numeric), and pierce_time (numeric). That custom table would look something like the following:
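
(The rows below are purely illustrative; your actual thicknesses, cut rates, and pierce times will depend on your machines and materials.)

material_family,thickness,cut_rate,pierce_time
Aluminum,0.060,250,0.5
Aluminum,0.125,150,0.8
Steel,0.060,180,0.7
Steel,0.125,100,1.0
Stainless Steel,0.060,120,0.9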

The operation below would query that custom table based on the material family and the detected thickness from sheet metal interrogation. We then determine a runtime based on the total number of pierces and cut length extracted by the Paperless sheet metal interrogation.

units_in()

sm = analyze_sheet_metal()

thickness = var('Thickness', 0, 'Thickness of material', number, frozen=False)
thickness.update(sm.thickness)
thickness.freeze()

# query the table for the smallest listed thickness greater than the detected thickness
row = table_var(
  'Material Selection',
  'Cutting properties for thickness and material family',
  'laser_cut_rates',
  create_filter(
    filter('material_family', '=', part.material_family),
    filter('thickness', '>', thickness),
  ),
  create_order_by('thickness', '-cut_rate'),
  'cut_rate'
)

cut_rate = var('Cut Rate', 0, 'inches per minute', number, frozen=False)
if row:
  cut_rate.update(row.cut_rate)
cut_rate.freeze()

pierce_time = var('Pierce Time', 0, 'pierce time in seconds', number, frozen=False)
if row:
  pierce_time.update(row.pierce_time)
pierce_time.freeze()

total_pierce_seconds = sm.pierce_count * pierce_time
total_travel_time = sm.total_cut_length / cut_rate

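# pierce_time is in seconds and cut_rate is in inches per minute, so convert both totals to hours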
runtime = var('runtime', 0, 'runtime in hours', number, frozen=False)
runtime.update(total_pierce_seconds / 3600 + total_travel_time / 60)
runtime.freeze()

rate = var('Rate', 50, '$/hr', currency)

PRICE = runtime * rate * part.qty
DAYS = 0

Custom tables are most useful when you need to interact with a lookup table that has more than a few rows. Additionally, interacting with custom tables through the table_var P3L function is very handy for lookups that often need to be overridden at quote time.
