The plugin can add both sitemaps to robots.txt automatically, but only if you let WordPress itself handle robots.txt requests. This means that (1) you need to remove the static robots.txt file from your site root, and (2) there must be no .htaccess (or nginx) rules that prevent robots.txt requests from reaching index.php (on a default installation there are none, so this should already be the case).
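To illustrate the kind of rule to look out for (a hypothetical nginx example, not something from the plugin documentation and not something you should add): a block like the following serves only a physical robots.txt file, so the request never reaches index.php and the plugin cannot add the sitemaps:

location = /robots.txt {
    # serves a physical file only; WordPress (index.php) never sees the request
    try_files $uri =404;
}

Default WordPress installations ship with no such rule, so normally nothing needs to change here.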
What I suggest is:
1. rename your static robots.txt file to robots.bak
2. visit your site’s /robots.txt in your browser to see if you get a valid response (see the example commands after this list).
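If you have shell access, both steps can also be done from the command line. A minimal sketch, assuming your WordPress root is /var/www/html and your domain is example.com (adjust both to your own setup):

cd /var/www/html                           # your WordPress root directory
mv robots.txt robots.bak                   # step 1: set the static file aside
curl -i https://example.com/robots.txt     # step 2: check the status code and response body

A 200 status with the content shown further below means WordPress is now answering robots.txt requests itself.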
If you get a 404 response or an error page, your easiest option is to rename robots.bak back to robots.txt and add the two sitemaps manually:
Sitemap: https://example.com/sitemap.xml
Sitemap: https://example.com/sitemap-news.xml
(replace https://example.com with your own site’s domain and protocol)
But if all is well, your browser should show the default WordPress-generated robots.txt with the sitemaps added by the plugin:
# XML Sitemap & Google News version 5.2.7 - https://status301.net/wordpress-plugins/xml-sitemap-feed/
Sitemap: https://demo.status301.net/sitemap.xml
Sitemap: https://demo.status301.net/sitemap-news.xml
User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php
Next, open your old robots.bak file and copy any custom rules you wish to keep into the new robots.txt (not a real file!) by going to Settings > Reading in your WordPress admin and pasting them into the field Additional robots.txt rules.
Attention: these custom rules will be automatically appended to the robots.txt response, below the default rules. You may need to modify them to make sense in that context!
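A purely hypothetical example, just to show the kind of adjustment: if your robots.bak contained

User-agent: *
Disallow: /wp-admin/
Disallow: /private/
Sitemap: https://example.com/old-sitemap.xml

you would paste only

User-agent: *
Disallow: /private/

into the field. The /wp-admin/ rule is already part of the defaults, the Sitemap lines are now handled by the plugin, and repeating the User-agent line keeps the extra rule attached to the right group no matter how a crawler splits up the combined response.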