
Automate replication of row-level security from AWS Lake Formation to Amazon QuickSight


Amazon QuickSight is a cloud-powered, serverless, and embeddable business intelligence (BI) service that makes it easy to deliver insights to your organization. As a fully managed service, Amazon QuickSight lets you create and publish interactive dashboards that can then be accessed from different devices and embedded into your applications, portals, and websites.

When authors create datasets, build dashboards, and share them with end users, the users see the same data as the author, unless row-level security (RLS) is enabled on the Amazon QuickSight dataset. Amazon QuickSight also provides options to pass a reader's identity to a data source using trusted identity propagation and apply RLS at the source. To learn more, see Centrally manage permissions for tables and views accessed from Amazon QuickSight with trusted identity propagation and Simplify access management with Amazon Redshift and AWS Lake Formation for users in an External Identity Provider.

However, there are a few requirements when using trusted identity propagation with Amazon QuickSight:

The authentication method for Amazon QuickSight must be AWS IAM Identity Center.
The dataset created using trusted identity propagation must be a direct query dataset in Amazon QuickSight. QuickSight SPICE can't be used with trusted identity propagation, because when using SPICE, data is imported (replicated), so the entitlements at the source can't be applied when readers access the dashboard.

This post outlines a solution to automatically replicate the entitlements for readers from the source (AWS Lake Formation) to Amazon QuickSight. This solution can be used even when the authentication method in Amazon QuickSight is not IAM Identity Center, and it works with both direct query and SPICE datasets in Amazon QuickSight. This lets you take advantage of the auto scaling that comes with SPICE. Although we focus on a Lake Formation table that exists in the same account, you can extend the solution for cross-account tables as well. When extracting data filter rules for a table in another account, the execution role must have the necessary access to the tables in that account.

Use case overview

For this post, let's consider a large financial institution that has implemented Lake Formation as its central data lake and entitlement management system. The institution aims to streamline access control and maintain a single source of truth for data permissions across its entire data ecosystem. By using Lake Formation for entitlement management, the financial institution can maintain a robust, scalable, and compliant data access control system that serves as the foundation for its data-driven operations and analytics initiatives. This approach is especially important for maintaining compliance with financial regulations and preserving data security. The analytics team wants to build an Amazon QuickSight dashboard for the data and business teams.

Solution overview

This solution uses the AWS Lake Formation and Amazon QuickSight APIs to extract, transform, and store AWS Lake Formation data filters in a format that can be used in QuickSight.

The solution has four key steps:

Extract and transform the row-level security (data filters) and the permissions on those data filters for the tables of interest from AWS Lake Formation.
Store the transformed rules and permissions in Amazon S3.
Create a rules dataset in Amazon QuickSight.
Test the row-level security from a QuickSight dashboard.

We use the following key services: AWS Lake Formation, Amazon QuickSight, AWS Lambda, Amazon Athena, and Amazon S3.

The following diagram illustrates the solution architecture.

Prerequisites

To implement this solution, you should have the following services enabled in the same account:

AWS Lake Formation
Amazon QuickSight
AWS Identity and Access Management (IAM) permissions: make sure you have the necessary IAM permissions to perform the operations across all of the services mentioned in the solution overview above
An AWS Lake Formation table with data filters and the right permissions
Amazon QuickSight principals (users or groups)

The following section shows how you can create the Amazon QuickSight groups and the AWS Lake Formation table and data filters.

Create groups in QuickSight

Create two groups in Amazon QuickSight: QuickSight_Readers and QuickSight_Authors. For instructions, see Create a group with the QuickSight console.

You can then form the Amazon Resource Names (ARNs) of the groups as follows. These will be used when granting permission in AWS Lake Formation for data filters.

arn:aws:quicksight:<>:<>:group/<>/QuickSight_Readers
arn:aws:quicksight:<>:<>:group/<>/QuickSight_Authors

You can also get the ARNs of the groups by running the Amazon QuickSight list-groups CLI command or API.
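If you prefer to script this step, the following is a minimal boto3 sketch that creates both groups and prints their ARNs. The account ID is a placeholder and the default QuickSight namespace is assumed; adjust both for your environment.

import boto3

# Assumption: replace the account ID; the default QuickSight namespace is used.
account_id = "111122223333"
namespace = "default"

qs = boto3.client("quicksight")

# Create the two groups used in this post (skip if they already exist).
for group_name in ["QuickSight_Readers", "QuickSight_Authors"]:
    qs.create_group(AwsAccountId=account_id, Namespace=namespace, GroupName=group_name)

# Print the group ARNs, which are needed when granting Lake Formation permissions.
for group in qs.list_groups(AwsAccountId=account_id, Namespace=namespace)["GroupList"]:
    print(group["GroupName"], group["Arn"])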

Create a table in AWS Lake Formation

The following section is for example purposes and is not necessary for production use of this solution. Complete the following steps to create a table in AWS Lake Formation using sample data. In this post, the table is called saas_sales.

Download the file Saas Sales.csv.
Upload the file to an Amazon S3 location.
Create a table in AWS Lake Formation.

Create row-level security (data filter) in AWS Lake Formation

In AWS Lake Formation, data filters are used to filter the data in a table for a user or group. Complete the following steps to create a data filter:

Create a data filter called QuickSightReaderFilter in the table saas_sales. For Row-level access, enter the expression segment='Enterprise'.

Grant the Amazon QuickSight group access to this data filter. Use the reader group ARN from the first step for SAML users and groups (a scripted sketch of this grant follows these steps).

Grant the QuickSight_Authors group full access to the table. Use the author group ARN from the first step for SAML users and groups.

(Optional) You can create another table called second_table, create another data filter called SecondFilter, and grant permission to the QuickSight_Readers group.
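For reference, the following is a minimal boto3 sketch of the data filter grant described above. The catalog (account) ID, database name, and reader group ARN are assumptions; substitute your own values.

import boto3

lf_client = boto3.client("lakeformation")

# Assumptions: replace with your catalog (account) ID, database name, and reader group ARN.
catalog_id = "111122223333"
reader_group_arn = "arn:aws:quicksight:us-east-1:111122223333:group/default/QuickSight_Readers"

# Grant SELECT on the QuickSightReaderFilter data filter to the QuickSight_Readers group.
lf_client.grant_permissions(
    Principal={"DataLakePrincipalIdentifier": reader_group_arn},
    Resource={
        "DataCellsFilter": {
            "TableCatalogId": catalog_id,
            "DatabaseName": "sampledb",  # assumption: the database that contains saas_sales
            "TableName": "saas_sales",
            "Name": "QuickSightReaderFilter",
        }
    },
    Permissions=["SELECT"],
)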

Now that you have set up the table, permissions, and data filters, you can extract the row-level access details for the QuickSight_Readers and QuickSight_Authors groups and the saas_sales table in AWS Lake Formation, and create the rules dataset in Amazon QuickSight for the saas_sales table.

Extract and transform data filters and permissions from AWS Lake Formation using a Lambda function

In AWS Lake Formation, data filters are created for each table. There can be many tables in AWS Lake Formation; however, for a team or a project, there is usually only a specific set of tables that the BI developer is interested in. Therefore, choose a list of tables to track and update the data filters for. In a batch process, for each table in AWS Lake Formation, extract the data filter definitions and write them to Amazon S3 using the AWS Lake Formation and Amazon S3 APIs.

We use the following AWS Lake Formation APIs to extract the data filter details and permissions (a paginated sketch follows this list):

ListDataCellsFilter – This API is used to list all the data filters in each table that is required for the project
ListPermissions – This API is used to retrieve the permissions for each of the data filters extracted using the ListDataCellsFilter API
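The sample Lambda code shown later calls these APIs directly; if a table has many data filters or permission entries, you may need to paginate through the results. The following is a minimal sketch, assuming a table dictionary with CatalogId, DatabaseName, and Name keys, that follows the NextToken markers returned by both APIs.

import boto3

lf_client = boto3.client("lakeformation")

def list_all_data_filters(table):
    """Return every data filter on a table, following pagination markers."""
    filters, kwargs = [], {"Table": table}
    while True:
        response = lf_client.list_data_cells_filter(**kwargs)
        filters += response.get("DataCellsFilters", [])
        if "NextToken" not in response:
            return filters
        kwargs["NextToken"] = response["NextToken"]

def list_all_permissions(resource):
    """Return every permission entry on a resource, following pagination markers."""
    permissions, kwargs = [], {"Resource": resource}
    while True:
        response = lf_client.list_permissions(**kwargs)
        permissions += response.get("PrincipalResourcePermissions", [])
        if "NextToken" not in response:
            return permissions
        kwargs["NextToken"] = response["NextToken"]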

The Lambda function covers three parts of the solution:

Extract the data filters and the permissions on the data filters for the tables of interest from AWS Lake Formation
Transform the data filters and permissions into a format usable in Amazon QuickSight
Persist the transformed data

Complete the following steps to create an AWS Lambda function:

On the Lambda console, create a function called Lake_Formation_QuickSight_RLS. Use Python 3.12 as the runtime and create a new role for execution.

Configure the Lambda function timeout to 2 minutes. This can vary depending on the number of tables to be parsed and the number of data filters to be transformed.
Attach the following permissions to the Lambda execution role:

{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "VisualEditor0",
            "Effect": "Allow",
            "Action": [
                "lakeformation:ListDataCellsFilter",
                "lakeformation:ListPermissions"
            ],
            "Resource": "*"
        },
        {
            "Sid": "VisualEditor1",
            "Effect": "Allow",
            "Action": "s3:PutObject",
            "Resource": "arn:aws:s3:::<your-bucket-name>/*"
        }
    ]
}

Set the following environment variables for the Lambda function:

S3Bucket – the S3 bucket where the output files will be stored
tablesToTrack – the list of tables to track, as JSON converted to a string
Tmp – /tmp

The Lambda function gets the list of tables and the S3 bucket details from the environment variables. The list of tables is given as a JSON array converted to a string. The JSON format is shown in the following code, followed by an example value. The values for CatalogId, DatabaseName, and Name can be fetched from the AWS Lake Formation console.

[
    {
        "CatalogId": "String",
        "DatabaseName": "String",
        "Name": "String"
    }
]
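For example, for the saas_sales table used in this post, the tablesToTrack value could look like the following (the account ID and database name are hypothetical placeholders):

[
    {
        "CatalogId": "111122223333",
        "DatabaseName": "sampledb",
        "Name": "saas_sales"
    }
]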

Add a folder named tmp.
Download the zip file Lake_Formation_QuickSight_RLS.zip.
Note: This is sample code for non-production usage. You should work with your security and legal teams to meet your organizational security, regulatory, and compliance requirements before deployment.
For the Lambda function code, upload the downloaded .zip file to the Lambda function on the Code tab.
Provide the necessary access to the execution role in AWS Lake Formation. Although the IAM permissions are attached to the Lambda execution role, explicit permission must also be given to the role in AWS Lake Formation for the Lambda function to get the details about the data filters. Therefore, explicitly grant access to the execution role while limiting the Lambda role to read-only admin. For more details, see Viewing data filters.
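One way to do this programmatically is to add the Lambda execution role to the Lake Formation read-only administrators. The following is a minimal sketch, assuming a hypothetical role ARN; it preserves the existing data lake settings and only appends to the read-only admin list.

import boto3

lf_client = boto3.client("lakeformation")

# Assumption: replace with your Lambda execution role ARN.
lambda_role_arn = "arn:aws:iam::111122223333:role/Lake_Formation_QuickSight_RLS-role"

# Fetch the current data lake settings and append the role as a read-only admin.
settings = lf_client.get_data_lake_settings()["DataLakeSettings"]
read_only_admins = settings.get("ReadOnlyAdmins", [])
if {"DataLakePrincipalIdentifier": lambda_role_arn} not in read_only_admins:
    read_only_admins.append({"DataLakePrincipalIdentifier": lambda_role_arn})
settings["ReadOnlyAdmins"] = read_only_admins

lf_client.put_data_lake_settings(DataLakeSettings=settings)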

In the following sections, we explain what the Lambda function code does in more detail.

Extract data filters and permissions for data filters and tables in AWS Lake Formation

The main flow of the code takes the list of tables as input and extracts the table and data filter permissions and the data filter rules. The approach is to get the permissions for the entire table and also for the data filters applied to the table. This way, both full access (table level) and partial access (data filter) can be extracted.


….
tablesToTrack = json.loads(os.environ["tablesToTrack"])
lf_client = boto3.client('lakeformation')
# For each table in the list, get the data filter rules attached to the table.
for table in tablesToTrack:
    df_response = lf_client.list_data_cells_filter(
        Table=table
    )
    d_filters += df_response["DataCellsFilters"]

    # Also, for each table in the list, get the list of permissions at table level.
    # This determines who has access to all rows in the table.
    tresponse = lf_client.list_permissions(
        Resource={
            "Table": table
        }
    )

    d_permissions += tresponse["PrincipalResourcePermissions"]

transformDataFilterRules(d_filters)

# For each data filter fetched above, get the permissions.
# This determines the row-level security for the tables.
for filter in d_filters:
    p_response = lf_client.list_permissions(
        Resource={
            "DataCellsFilter": {
                "DatabaseName": filter["DatabaseName"],
                "Name": filter["Name"],
                "TableCatalogId": filter["TableCatalogId"],
                "TableName": filter["TableName"]
            }
        }
    )
    d_permissions += p_response["PrincipalResourcePermissions"]

transformFilterandTablePermissions(d_permissions)

Transform data filter definitions into a format usable in Amazon QuickSight

The extracted permissions and filters are transformed to create a rules dataset in Amazon QuickSight. Data filters can be defined in different ways, for example with AND and OR conditions across one or more columns, and these conditions are flattened during transformation.

The function transformDataFilterRules in the following code can transform some of the OR and AND conditions into a format acceptable to Amazon QuickSight. The following are the details available in the transformed format:

Lake Formation catalog ID
Lake Formation database name
Lake Formation table name
Lake Formation data filter name
List of columns from all of the tables provided in the input for which the data filter rules are defined

See the following code:

def transformDataFilterRules(rules):
    global complete_transformed_filter_rules
    transformed_filter_rules = []
    filter_to_extract = []
    complete_transformed_filter_rules = []
    col_headers = []
    col_headers.append("catalog")
    col_headers.append("database")
    col_headers.append("table")
    col_headers.append("filter")

    for rule in rules:
        print(rule)
        catalog = rule["TableCatalogId"]
        database = rule["DatabaseName"]
        table = rule["TableName"]
        filter = rule["Name"]
        row = []
        row.append(catalog)
        row.append(database)
        row.append(table)
        row.append(filter)
        logger.info(f"row==={row}")

        # Split the filter expression on OR/AND so each condition becomes its own rule row.
        f_conditions = re.split(' OR | or | and | AND ', rule["RowFilter"]["FilterExpression"])

        for f_condition in f_conditions:
            logger.info(f"f_condition={f_condition}")
            f_condition = f_condition.replace("(", "")
            f_condition = f_condition.replace(")", "")
            filter_rule_column = f_condition.split("=")
            if len(filter_rule_column) > 1:
                filter_rule_column[0] = filter_rule_column[0].strip()
                if not filter_rule_column[0].strip() in col_headers:
                    col_headers.append(filter_rule_column[0].strip())
                i = col_headers.index(filter_rule_column[0].strip())
                j = i - (len(row) - 1)
                if j > 0:
                    for x in range(1, j):
                        row.append("")
                logger.info(f"i={i} j={j} {filter_rule_column[1]}")
                row.insert(i, filter_rule_column[1].replace("'", ""))
                print(row)
                transformed_filter_rules.append(','.join(row))

                row = []
                row.append(catalog)
                row.append(database)
                row.append(table)
                row.append(filter)

    # Pad every row to the full set of columns discovered across all filters.
    max_columns = len(col_headers)
    complete_transformed_filter_rules = []
    for rule in transformed_filter_rules:
        r = rule.split(",")
        to_fill = max_columns - len(r)
        if to_fill > 0:
            for x in range(1, to_fill + 1):
                r.append("")
        complete_transformed_filter_rules.append(','.join(r))

    complete_transformed_filter_rules.insert(0, ','.join(col_headers))

The transformed file contains the columns from both tables. When creating a rules dataset for a specific table, the records are filtered for that table when pulled into Amazon QuickSight.
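As a hypothetical illustration (the account ID, database name, and the second table's filter column are placeholders), the transformed rules file for the filters created earlier could look like the following:

catalog,database,table,filter,segment,industry
111122223333,sampledb,saas_sales,QuickSightReaderFilter,Enterprise,
111122223333,sampledb,second_table,SecondFilter,,Healthcare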

The function transformFilterandTablePermissions in the following code snippet combines and transforms the table and data filter permissions into a flat structure that contains the following columns:

Amazon QuickSight group ARN
Lake Formation catalog ID
Lake Formation database name
Lake Formation table name
Lake Formation data filter name

See the following code:

def transformFilterandTablePermissions(permissions):
    global transformed_table_permissions, transformed_filter_permissions
    # Read and set table-level access
    transformed_table_permissions = []
    transformed_filter_permissions = []
    transformed_filter_permissions.insert(0, "group,catalog,database,table,filter")
    transformed_table_permissions.insert(0, "group,catalog,database,table")

    for permission in permissions:
        group = ""
        database = ""
        table = ""
        catalog = ""

        p = permission["Permissions"]

        if "DESCRIBE" in p or "SELECT" in p:

            group = permission["Principal"]["DataLakePrincipalIdentifier"]
            if "Database" in permission["Resource"]:
                catalog = permission["Resource"]["Database"]["CatalogId"]
                database = permission["Resource"]["Database"]["Name"]
                table = "*"
                transformed_table_permissions.append(group + "," + catalog + "," + database + "," + table)
                transformed_filter_permissions.append(group + "," + catalog + "," + database + "," + table)
            elif "TableWithColumns" in permission["Resource"] or "Table" in permission["Resource"]:
                if "TableWithColumns" in permission["Resource"]:
                    catalog = permission["Resource"]["TableWithColumns"]["CatalogId"]
                    database = permission["Resource"]["TableWithColumns"]["DatabaseName"]
                    table = permission["Resource"]["TableWithColumns"]["Name"]
                elif "Table" in permission["Resource"]:
                    catalog = permission["Resource"]["Table"]["CatalogId"]
                    database = permission["Resource"]["Table"]["DatabaseName"]
                    table = permission["Resource"]["Table"]["Name"]
                transformed_table_permissions.append(group + "," + catalog + "," + database + "," + table)
                transformed_filter_permissions.append(group + "," + catalog + "," + database + "," + table)
            elif "DataCellsFilter" in permission["Resource"]:
                catalog = permission["Resource"]["DataCellsFilter"]["TableCatalogId"]
                database = permission["Resource"]["DataCellsFilter"]["DatabaseName"]
                table = permission["Resource"]["DataCellsFilter"]["TableName"]
                filter = permission["Resource"]["DataCellsFilter"]["Name"]
                transformed_filter_permissions.append(group + "," + catalog + "," + database + "," + table + "," + filter)

AWS Lake Formation data filters can be applied to any principal; however, we focus on the Amazon QuickSight principals. In the extracted data filter and table permissions (a hypothetical excerpt follows this list):

The QuickSight_Authors ARN has full access to two tables. This is determined by transforming the table-level permissions together with the data filter permissions.
The QuickSight_Readers ARN has limited access based on filter conditions.
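As a hypothetical illustration (the account ID, Region, and database name are placeholders), the transformed permissions file could contain rows like the following, where a table-level grant has no filter value:

group,catalog,database,table,filter
arn:aws:quicksight:us-east-1:111122223333:group/default/QuickSight_Authors,111122223333,sampledb,saas_sales
arn:aws:quicksight:us-east-1:111122223333:group/default/QuickSight_Readers,111122223333,sampledb,saas_sales,QuickSightReaderFilter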

Store the transformed rules and permissions in two separate files in Amazon S3

The transformed rules and permissions are then persisted in a data store. In this solution, the transformed rules are written to an Amazon S3 location in CSV format. The names of the files created by the Lambda function are:

transformed_filter_permissions.csv
transformed_filter_rules.csv

See the following code:

with open("/tmp/transformed_table_permissions.csv", "w") as txt_file:
    for line in transformed_table_permissions:
        txt_file.write(line + "\n")  # works with any number of elements in a line

s3 = boto3.resource('s3')
s3.meta.client.upload_file(Filename="/tmp/transformed_table_permissions.csv", Bucket=os.environ['S3Bucket'], Key="table-permissions/transformed_table_permissions.csv")

with open("/tmp/transformed_filter_permissions.csv", "w") as txt_file:
    for line in transformed_filter_permissions:
        txt_file.write(line + "\n")  # works with any number of elements in a line

s3.meta.client.upload_file(Filename="/tmp/transformed_filter_permissions.csv", Bucket=os.environ['S3Bucket'], Key="filter-permissions/transformed_filter_permissions.csv")

with open("/tmp/transformed_filter_rules.csv", "w") as txt_file:
    for line in complete_transformed_filter_rules:
        txt_file.write(line + "\n")  # works with any number of elements in a line

s3.meta.client.upload_file(Filename="/tmp/transformed_filter_rules.csv", Bucket=os.environ['S3Bucket'], Key="filter-rules/transformed_filter_rules.csv")

Create a rules dataset in Amazon QuickSight

In this section, we walk through the steps to create a rules dataset in Amazon QuickSight.

Create a table in Lake Formation for the files

The first step is to create a table in AWS Lake Formation for the two files, transformed_filter_permissions.csv and transformed_filter_rules.csv.

Although you can directly use an Amazon S3 connector in Amazon QuickSight, creating a table and building the rules dataset through an Athena connector provides flexibility in writing custom SQL and using direct query. For the steps to bring an Amazon S3 location into AWS Lake Formation, see Creating tables.

For this post, the tables for the files are created in a separate database called quicksight_lf_transformation.

Grant permission for the tables to the QuickSight_Authors group

Grant permission in AWS Lake Formation for the two tables to the QuickSight_Authors group. This is essential for Amazon QuickSight authors to create a rules dataset in Amazon QuickSight.

Create a rules dataset in Amazon QuickSight

Amazon QuickSight supports both user-level and group-level RLS. In this post, we use groups to enable RLS. To create the rules dataset, you first join the filter permissions table with the filter rules table on the columns catalog, database, table, and filter. Then you can filter the permissions to include only the Amazon QuickSight principals, and include only the columns required for the dataset. The objective in this solution is to build a rules dataset for the saas_sales table.

Complete the following steps:

On the Amazon QuickSight console, create a new Athena dataset.
Specify the following:

For Catalog, choose AwsDataCatalog.
For Database, choose quicksight_lf_transformation.
For Table, choose filter_permissions.

Choose Edit/Preview data.
Choose Add data.
Choose Add source.
Select Athena.
Specify the following:

For Catalog, choose AwsDataCatalog.
For Database, choose quicksight_lf_transformation.
For Table, choose filter_rules.

Join the permissions table with the data filter rules table on the catalog, database, table, and filter columns.
Rename the column group to GroupArn. This must be done before the filter is applied.
Filter the data where the column table equals saas_sales.
Filter the data where the column group starts with arn:aws:quicksight (Amazon QuickSight principals).

Exclude fields that are not part of the saas_sales table.
Change Query mode to SPICE.
Publish the dataset.

If your organization has a mapping of other principals to an Amazon QuickSight group or user, you can apply that mapping before joining the tables.

You can also write the following custom SQL to achieve the same result:

SELECT a."group" as GroupArn, segment FROM "quicksight_lf_transformation"."filter_permissions" as a
left join
"quicksight_lf_transformation"."filter_rules" as b
on
a.catalog = b.catalog and
a.database = b.database and
a."table" = b."table" and
a.filter = b.filter
where a."table" = 'saas_sales'
and a."group" like 'arn:aws:quicksight%'

Name the dataset LakeFormationRLSDataSet and publish the dataset.

Test the row-level security

Now you're ready to test the row-level security by publishing a dashboard as a user in the QuickSight_Authors group and then viewing the dashboard as a user in the QuickSight_Readers group.

Publish a dashboard as a QuickSight_Authors group user

As an author who belongs to the QuickSight_Authors group, the user will be able to see the saas_sales table in the Athena connector and all the data in the table. As shown in this section, all three segments are visible to the author when creating an analysis and viewing the published dashboard.

Create a dataset by pulling data from the saas_sales table using the Athena connector.

Attach LakeFormationRLSDataSet as the RLS dataset for the saas_sales dataset (a verification sketch follows these steps). For instructions, see Using row-level security with user-based rules to restrict access to a dataset.

Create an analysis using the saas_sales dataset as an author who belongs to the QuickSight_Authors group.
Publish the dashboard.

Share the dashboard with the QuickSight_Readers group.
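If you want to confirm programmatically that the rules dataset is attached, the following sketch describes the saas_sales dataset and prints its row-level permission configuration. The account ID and dataset ID are assumptions; use the ID of the dataset you created.

import boto3

# Assumptions: replace with your AWS account ID and the ID of the saas_sales dataset.
account_id = "111122223333"
dataset_id = "saas-sales-dataset-id"

qs = boto3.client("quicksight")
dataset = qs.describe_data_set(AwsAccountId=account_id, DataSetId=dataset_id)["DataSet"]

# Shows the ARN of the rules dataset (LakeFormationRLSDataSet) and the permission policy.
print(dataset.get("RowLevelPermissionDataSet"))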

View the dashboard as a QuickSight_Readers group user

Complete the following steps to view the dashboard as a QuickSight_Readers group user:

Log in to Amazon QuickSight as a reader who belongs to the QuickSight_Readers group.

The user will be able to see only the Enterprise segment.

Now, change the RLS in AWS Lake Formation by setting the segment to SMB in the QuickSightReaderFilter.
Run the Lambda function to export and transform the new data filter rules.
Refresh the SPICE dataset LakeFormationRLSDataSet in Amazon QuickSight.
When the refresh is complete, refresh the dashboard in the reader login.

Now the reader user will see SMB data.
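To keep the entitlements in sync without manual steps, you can trigger the SPICE refresh right after the Lambda function writes the new files. The following is a minimal sketch, assuming hypothetical account and dataset IDs, that starts a full refresh of the rules dataset; it could be appended to the Lambda function or scheduled separately.

import time
import boto3

# Assumptions: replace with your AWS account ID and the ID of LakeFormationRLSDataSet.
account_id = "111122223333"
rules_dataset_id = "lakeformation-rls-dataset-id"

qs = boto3.client("quicksight")

# Start a full SPICE refresh; the ingestion ID only needs to be unique per dataset.
qs.create_ingestion(
    AwsAccountId=account_id,
    DataSetId=rules_dataset_id,
    IngestionId=f"rls-refresh-{int(time.time())}",
)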

Clean up

Amazon QuickSight resources

Delete the Amazon QuickSight dashboard and analysis you created
Delete the datasets saas_sales and LakeFormationRLSDataSet
Delete the Athena data source
Delete the QuickSight groups using the DeleteGroup API

AWS Lake Formation resources

Delete the database quicksight_lf_transformation created in AWS Lake Formation
Revoke the permissions given to the Lambda execution role
Delete the saas_sales table and the data filters you created
If you used an AWS Glue crawler to create the tables in AWS Lake Formation, remove the Glue crawler as well

Compute resources

Delete the AWS Lambda function you created
Delete the AWS Lambda execution role associated with the function

Storage resources

Empty the contents of the Amazon S3 bucket created for this solution
Delete the Amazon S3 bucket

Conclusion

This post explained how to automatically replicate row-level security from AWS Lake Formation to Amazon QuickSight. This makes sure that a SPICE dataset in QuickSight can use the row-level access defined in Lake Formation.

This solution can also be extended to other data sources. The logic to programmatically extract the entitlements from the source and transform them into the Amazon QuickSight format will differ by source, but after the extract and transform are in place, the solution can scale to multiple teams in the organization. Although this post laid out a basic approach, the automation should either be scheduled to run periodically or be triggered by events such as data filter changes or the grant or revoke of AWS Lake Formation permissions, so that the entitlements remain in sync between AWS Lake Formation and Amazon QuickSight.

Try out this solution for your own use case, and share your feedback in the comments.

About the Authors

Vetri Natarajan is a Specialist Solutions Architect for Amazon QuickSight. Vetri has 15 years of experience implementing enterprise business intelligence (BI) solutions and greenfield data products. Vetri specializes in integrating BI solutions with business applications and enabling data-driven decisions.

Ismael Murillo is a Solutions Architect for Amazon QuickSight. Before joining AWS, Ismael worked in Amazon Logistics (AMZL) with delivery station management, delivery service providers, and customers directly in the field. Ismael focused on last-mile delivery and delivery success, and designed and implemented many innovative solutions to help reduce cost and improve delivery success. He is also a United States Army veteran, having served for eleven years.


