Compare commits

...

28 Commits
8.5 ... 3.2

Author SHA1 Message Date
Alex Vandiver
0aa67c0c99 Release Zulip Server 3.2 2020-09-15 15:58:33 -07:00
Aman
8d67598ff2 provision: Fix missing <sasl/sasl.h> headers during provision.
(cherry picked from commit 7b9fe77bf1)
2020-09-15 01:48:26 -07:00
Anders Kaseorg
34a13c8094 requirements: Remove django-cookies-samesite.
Its functionality was added to Django upstream in 2.1.  Also remove
the SESSION_COOKIE_SAMESITE = 'Lax' setting since it’s the default.

Signed-off-by: Anders Kaseorg <anders@zulip.com>
(cherry picked from commit e84c7fb09f)
2020-09-15 01:16:49 -07:00
Alex Vandiver
36ce1ce75e filters: Fix tests for deac48810d.
deac48810d cherry-picked the behaviour changes of 4167517a6f, but not
the test changes to go with it.
2020-09-11 19:49:34 -07:00
Alex Vandiver
f36b935f0e puppet: Restrict postfix incoming addresses to postmaster and zulip.
This removes the possibility of local user enumeration via RCPT TO.
2020-09-11 18:50:47 -07:00
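To illustrate the effect, here is a minimal Python sketch (not Postfix itself; `smtp_accepts` is an illustrative name, and the patterns are taken from the new `/etc/postfix/access` map shown in the diffs below) of the recipient allow-list semantics: only gateway-style addresses and postmaster are accepted, so `RCPT TO` probes for arbitrary local users are rejected.

```
import re

# Patterns from the new access map: addresses containing "+" or ".",
# local parts starting with "mm", and the RFC822-required postmaster.
ACCEPT_PATTERNS = [r"\+.*@", r"\..*@", r"^mm", r"^postmaster@"]

def smtp_accepts(rcpt_to: str) -> bool:
    # Postfix regexp tables match unanchored, like re.search.
    return any(re.search(pattern, rcpt_to) for pattern in ACCEPT_PATTERNS)

assert smtp_accepts("zulip+token@example.com")  # incoming-email gateway address
assert smtp_accepts("postmaster@example.com")
assert not smtp_accepts("alice@example.com")    # RCPT TO no longer reveals local users
```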
Tim Abbott
c2508c0966 docs: Fix spelling of audit_fts_indexes.
Fixes #16140.
2020-08-14 16:57:03 -07:00
Mohit Gupta
deac48810d filters: Fix has:image and avoid future issues for other has filters.
This fixes a bug with the original frontend-side implementation of
has: filters, where it would incorrectly fail to match content in
cases where the message's nesting structure did not have an outer tag.

Bug was introduced in 02ea52fc18.

Fixes #16118.
2020-08-14 16:53:16 -07:00
Alex Vandiver
c316f267e7 docs: Add explicit steps to verify FTS indexes after upgrading to 3.0.
The OS upgrade paths which go through 2.1 do not call
`upgrade-zulip-stage-2` with `--audit-fts-indexes` because that flag
was added in 3.0.

Add an explicit step to do this audit after the 3.0 upgrade.  Stating
it as another command to run, rather than telling readers to add it
to the `upgrade-zulip` call that we link to, seems easiest, since
that does not dictate whether they should upgrade to a release or
from the tip of git.

We do not include a step describing this for the Trusty -> Xenial
upgrade, because the last step already chains into Xenial -> Bionic,
which itself describes auditing the indexes.

Fixes #15877.
2020-08-12 12:49:23 -07:00
Alex Vandiver
87e02760bf docs: Be explicit about continuing with upgrades.
Strongly suggest Xenial -> Bionic, or upgrading to 3.x, at the end of
the various other upgrading steps.
2020-08-12 12:49:23 -07:00
Alex Vandiver
0b7be2610c docs: Fold "check if it is working" into the last step. 2020-08-12 12:49:23 -07:00
Alex Vandiver
94f57ad8bd docs: Don't suggest --audit-fts-indexes for non-3.0 upgrades.
Only Zulip 3.0 and above support the `--audit-fts-indexes` option to
`upgrade-zulip-stage-2`; saying "same as Bionic to Focal" on the
other steps, which are for Zulip 2.1 or 2.0, will result in errors.

Provide the full text of the updated `upgrade-zulip-stage-2` call in
step 5 for all non-3.0 upgrades.  For Trusty to Xenial and Stretch to
Buster, we do not say "Same as Xenial to Bionic", because readers
would likely not notice that the step does not read "Same as
Bionic to Focal."
2020-08-12 12:49:23 -07:00
Tim Abbott
17e4b34f10 docs: Clarify how manage.py backup --output works. 2020-08-09 17:44:26 -07:00
Tim Abbott
5bf521fa55 compose: Fix buggy message post policy warning.
The previous logic with `new Date` produced invalid values for
differences longer than a year.
2020-08-06 15:44:38 -07:00
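The fixed computation is plain elapsed-time division rather than constructing a `Date` from the difference (calling `.getDate()` on such a `Date` yields a day-of-month, capped at 31, so spans over a year came out wrong). A rough Python analogue of the corrected arithmetic, with an illustrative helper name and dates:

```
from datetime import datetime, timedelta

def days_since(date_joined: datetime, now: datetime) -> float:
    # Equivalent to the fixed JS: (current_datetime - person_date_joined) / 1000 / 86400
    return (now - date_joined) / timedelta(days=1)

# Differences longer than a year now come out correctly.
assert days_since(datetime(2019, 1, 1), datetime(2020, 7, 1)) > 365
```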
sahil839
29dd22e405 stream_edit: Send values of changed settings only to backend.
This commit changes the change_stream_privacy function to send only
the values of changed settings to the backend.
We also avoid sending a PATCH request if none of the settings
in the stream privacy modal have changed.

This change also fixes the bug in changing stream permissions
for realms with limited plans.

Fixes #16024.
2020-08-06 13:03:57 -07:00
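The general pattern (sketched here in Python with an illustrative `changed_settings` helper; the actual change is in the `stream_edit.js` diff below) is to build the request payload from only the settings whose new value differs from the current subscription, and to skip the request entirely when that payload is empty.

```
def changed_settings(current: dict, desired: dict) -> dict:
    # Keep only the keys whose desired value differs from the current one.
    return {key: value for key, value in desired.items() if current.get(key) != value}

data = changed_settings(
    {"is_private": False, "stream_post_policy": 1},
    {"is_private": False, "stream_post_policy": 2},
)
assert data == {"stream_post_policy": 2}
# if not data: skip sending the PATCH request entirely
```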
Anders Kaseorg
efe9cbba29 memcached: Switch from pylibmc to python-binary-memcached.
Backported to 3.x by tabbott.

Signed-off-by: Anders Kaseorg <anders@zulip.com>
2020-08-06 12:52:38 -07:00
Mateusz Mandera
b0d2094967 auth: Treat emails case-insensitively in ExternalAuthResult.
Our intent throughout the codebase is to treat email addresses
case-insensitively.
The only codepath affected by this bug is remote_user_sso, as that's the
only one that can currently pass both a user_profile and an
ExternalAuthDataDict when creating the ExternalAuthResult. That's why we
add a test specifically for that codepath.
2020-08-05 11:40:51 -07:00
Casper Kvan Clausen
584d71a221 puppet: Support nginx_listen_port with http_only 2020-08-03 18:44:02 -07:00
Tim Abbott
12ac89ef3f tornado: Fix ID lists leaked to the events API.
Apparently, `update_message` events unexpectedly contained what were
intended to be internal data structures about which users were
mentioned in a given message.

The bug has been present and accumulating new data structures for
years.

Fixing this should improve the performance of handling update_message
events as well as cleaning up this API's interface.

This was discovered by our automated API documentation schema checking
tooling detecting these unexpected elements in these event
definitions; the same logic should prevent similar bugs from being
introduced in the future.
2020-08-03 18:27:57 -07:00
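A minimal sketch of the shape of the fix (names illustrative; the real change is in the `process_message_update_event` diff below): copy the event and `pop()` the internal notification-bookkeeping fields, so the dict delivered to API clients no longer carries these ID lists.

```
from typing import Any, Dict, Mapping

INTERNAL_FIELDS = [
    "prior_mention_user_ids", "mention_user_ids", "wildcard_mention_user_ids",
    "presence_idle_user_ids", "stream_push_user_ids", "stream_email_user_ids",
    "push_notify_user_ids",
]

def strip_internal_fields(orig_event: Mapping[str, Any]) -> Dict[str, Any]:
    event = dict(orig_event)  # don't mutate the caller's event
    for field in INTERNAL_FIELDS:
        event.pop(field, None)  # consumed server-side, never sent to clients
    return event
```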
Mateusz Mandera
3870a1b304 find_account: Fix the email search query.
The search should be case-insensitive.
2020-08-02 12:37:44 -07:00
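Since Django has no `__iexact__in` lookup, the fix ORs together one `__iexact` clause per address so a single query covers the whole list; a sketch of that pattern (`build_emails_q` is an illustrative name; the full change is in the `find_account` diff below):

```
from django.db.models import Q

def build_emails_q(emails):
    # OR together one case-insensitive match per address.
    emails_q = Q()
    for email in emails:
        emails_q |= Q(delivery_email__iexact=email)
    return emails_q

# Usage: UserProfile.objects.filter(build_emails_q(emails), is_active=True)
```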
Tim Abbott
928b8ad031 version: Update version after 3.1 release. 2020-07-30 15:52:52 -07:00
Tim Abbott
31f7006309 Release Zulip Server 3.1. 2020-07-30 15:44:18 -07:00
arpit551
d8b966e528 migrations: Upgrade migrations to remove duplicates in all Count tables.
This commit upgrades the 0015_clear_duplicate_counts migration to
remove duplicate counts in StreamCount, UserCount, and
InstallationCount as well.

Fixes https://github.com/zulip/docker-zulip/issues/266
2020-07-30 15:18:07 -07:00
Mateusz Mandera
444359ebd3 saml: Use self.logger in get_issuing_idp.
get_issuing_idp is no longer a class method, so the awkward logger
fetching can be skipped and self.logger can be accessed directly.
2020-07-26 15:49:44 -07:00
Mateusz Mandera
c78bdd6330 saml: Fix incorrect settings object being passed in get_issuing_idp.
Fixes #15904.

settings is supposed to be a proper OneLogin_Saml2_Settings object,
rather than an empty dictionary. This bug wasn't easy to spot because
the codepath that demonstrates it runs only if the SAMLResponse
contains encrypted assertions.
2020-07-26 15:49:43 -07:00
Gittenburg
f4e02f0e80 upload: Do not open compose box when editing.
Previously, uploading a file in the edit textarea while editing
a message opened the message compose box.

Fixes #15890.
2020-07-23 11:29:51 -07:00
Gittenburg
77234ef40b message_edit: Fix invisible delete spinner.
Introduced in 953d475274.
2020-07-23 10:25:02 -07:00
Tim Abbott
00f9cd672b docs: Fix versions in stretch=>buster documentation. 2020-07-22 16:36:00 -07:00
Emilio López
c33a7dfff4 email_mirror: Fix exception when handling unstructured headers.
This commit rewrites the way addresses are collected. If
the header with the address is not an AddressHeader (for instance,
Delivered-To and Envelope-To), we take its string representation.

Fixes: #15864 ("Error in email_mirror - _UnstructuredHeader has no attribute addresses").
2020-07-22 12:11:38 -07:00
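The underlying issue is that Python's email library only parses well-known address headers (To, Cc, ...) into `AddressHeader` objects; unknown headers such as Envelope-To come back as unstructured headers with no `.addresses` attribute. A small self-contained stdlib demonstration of the distinction the fix relies on:

```
from email.headerregistry import AddressHeader
from email.message import EmailMessage

msg = EmailMessage()
msg["To"] = "zulip@example.com"           # known header: parsed as an AddressHeader
msg["Envelope-To"] = "zulip@example.com"  # unknown header: unstructured, no .addresses

for name in ("To", "Envelope-To"):
    header_value = msg[name]
    if isinstance(header_value, AddressHeader):
        emails = [addr.addr_spec for addr in header_value.addresses]
    else:
        emails = [str(header_value)]      # fall back to the string representation
    print(name, "->", emails)
```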
37 changed files with 364 additions and 216 deletions

View File

@@ -10,7 +10,7 @@ def clear_duplicate_counts(apps: StateApps, schema_editor: DatabaseSchemaEditor)
The backstory is that Django's unique_together indexes do not properly
handle the subgroup=None corner case (allowing duplicate rows that have a
subgroup of None), which meant that in race conditions, rather than updating
-an existing row for the property/realm/time with subgroup=None, Django would
+an existing row for the property/(realm, stream, user)/time with subgroup=None, Django would
create a duplicate row.
In the next migration, we'll add a proper constraint to fix this bug, but
@@ -20,26 +20,32 @@ def clear_duplicate_counts(apps: StateApps, schema_editor: DatabaseSchemaEditor)
this means deleting the extra rows, but for LoggingCountStat objects, we need to
additionally combine the sums.
"""
-    RealmCount = apps.get_model('analytics', 'RealmCount')
+    count_tables = dict(realm=apps.get_model('analytics', 'RealmCount'),
+                        user=apps.get_model('analytics', 'UserCount'),
+                        stream=apps.get_model('analytics', 'StreamCount'),
+                        installation=apps.get_model('analytics', 'InstallationCount'))

-    realm_counts = RealmCount.objects.filter(subgroup=None).values(
-        'realm_id', 'property', 'end_time').annotate(
-        Count('id'), Sum('value')).filter(id__count__gt=1)
+    for name, count_table in count_tables.items():
+        value = [name, 'property', 'end_time']
+        if name == 'installation':
+            value = ['property', 'end_time']
+        counts = count_table.objects.filter(subgroup=None).values(*value).annotate(
+            Count('id'), Sum('value')).filter(id__count__gt=1)

-    for realm_count in realm_counts:
-        realm_count.pop('id__count')
-        total_value = realm_count.pop('value__sum')
-        duplicate_counts = list(RealmCount.objects.filter(**realm_count))
-        first_count = duplicate_counts[0]
-        if realm_count['property'] in ["invites_sent::day", "active_users_log:is_bot:day"]:
-            # For LoggingCountStat objects, the right fix is to combine the totals;
-            # for other CountStat objects, we expect the duplicates to have the same value.
-            # And so all we need to do is delete them.
-            first_count.value = total_value
-            first_count.save()
-        to_cleanup = duplicate_counts[1:]
-        for duplicate_count in to_cleanup:
-            duplicate_count.delete()
+        for count in counts:
+            count.pop('id__count')
+            total_value = count.pop('value__sum')
+            duplicate_counts = list(count_table.objects.filter(**count))
+            first_count = duplicate_counts[0]
+            if count['property'] in ["invites_sent::day", "active_users_log:is_bot:day"]:
+                # For LoggingCountStat objects, the right fix is to combine the totals;
+                # for other CountStat objects, we expect the duplicates to have the same value.
+                # And so all we need to do is delete them.
+                first_count.value = total_value
+                first_count.save()
+            to_cleanup = duplicate_counts[1:]
+            for duplicate_count in to_cleanup:
+                duplicate_count.delete()
class Migration(migrations.Migration):

View File

@@ -177,7 +177,7 @@ git remote add -f upstream https://github.com/zulip/zulip.git
```
doas pkg_add sudo bash gcc postgresql-server redis rabbitmq \
-  memcached libmemcached py-Pillow py-cryptography py-cffi
+  memcached py-Pillow py-cryptography py-cffi
# Point environment to custom include locations and use newer GCC
# (needed for Node modules):

View File

@@ -7,6 +7,48 @@ All notable changes to the Zulip server are documented in this file.
This section lists notable unreleased changes; it is generally updated
in bursts.
### 3.2 -- September 15, 2020
- Switched from `libmemcached` to `python-binary-memcached`, a
pure-Python implementation; this should eliminate memcached
connection problems affecting some installations.
- Removed unnecessary `django-cookies-samesite` dependency, which had
its latest release removed from PyPI (breaking installation of Zulip
3.1).
- Limited which local email addresses Postfix accepts when the
incoming email integration is enabled; this prevents the enumeration
of local users via the email system.
- Fixed incorrectly case-sensitive email validation in `REMOTE_USER`
authentication.
- Fixed search results for `has:image`.
- Fixed ability to adjust "Who can post on the stream" configuration.
- Fixed display of "Permission [to post] will be granted in n days"
for n > 365.
- Support providing `nginx_listen_port` setting in conjunction with
`http_only` in `zulip.conf`.
- Improved upgrade documentation.
- Removed internal ID lists which could leak into the events API.
### 3.1 -- July 30, 2020
- Removed unused `short_name` field from the User model. This field
had no purpose and could leak the local part of email addresses
when email address visibility was restricted.
- Fixed a bug where loading spinners would sometimes not be displayed.
- Fixed incoming email gateway exception with unstructured headers.
- Fixed AlertWords not being included in data import/export.
- Fixed Twitter previews not including a clear link to the tweet.
- Fixed compose box incorrectly opening after uploading a file in a
message edit widget.
- Fixed exception in SAML integration with encrypted assertions.
- Fixed an analytics migration bug that could cause upgrading from 2.x
releases to fail.
- Added a Thinkst Canary integration (and renamed the old one, which
was actually an integration for canarytokens.org).
- Reformatted the frontend codebase using prettier. This change was
included in this maintenance release to ensure backporting patches
from master remains easy.
### 3.0 -- July 16, 2020
#### Highlights

View File

@@ -65,8 +65,9 @@ su zulip -c '/home/zulip/deployments/current/manage.py backup'
```
The backup tool provides the following options:
-- `--output`: Path where the output file should be stored. If no path is
-  provided, the output file is saved to a temporary directory.
+- `--output=/tmp/backup.tar.gz`: Filename to write the backup tarball
+  to (default: write to a file in `/tmp`). On success, the
+  console output will show the path to the output tarball.
- `--skip-db`: Skip backup of the database. Useful if you're using a
remote postgres host with its own backup system and just need to
backup non-database state.

View File

@@ -249,9 +249,9 @@ instructions for other supported platforms.
/home/zulip/deployments/current/ --ignore-static-assets --audit-fts-indexes
```
-That last command will finish by restarting your Zulip server; you
-should now be able to navigate to its URL and confirm everything is
-working correctly.
+This will finish by restarting your Zulip server; you should now be
+able to navigate to its URL and confirm everything is working
+correctly.
### Upgrading from Ubuntu 16.04 Xenial to 18.04 Bionic
@@ -278,11 +278,28 @@ working correctly.
systemctl restart memcached
```
-5. Same as for Bionic to Focal.
-
-   That last command will finish by restarting your Zulip server; you
-   should now be able to navigate to its URL and confirm everything is
-   working correctly.
+5. Finally, we need to reinstall the current version of Zulip, which
+   among other things will recompile Zulip's Python module
+   dependencies for your new version of Python:
+
+   ```
+   rm -rf /srv/zulip-venv-cache/*
+   /home/zulip/deployments/current/scripts/lib/upgrade-zulip-stage-2 \
+       /home/zulip/deployments/current/ --ignore-static-assets
+   ```
+
+   This will finish by restarting your Zulip server; you should now
+   be able to navigate to its URL and confirm everything is working
+   correctly.
+
+6. [Upgrade to the latest Zulip release](#upgrading-to-a-release), now
+   that your server is running a supported operating system.
+
+7. As root, finish by verifying the contents of the full-text indexes:
+
+   ```
+   /home/zulip/deployments/current/manage.py audit_fts_indexes
+   ```
### Upgrading from Ubuntu 14.04 Trusty to 16.04 Xenial
@@ -295,7 +312,7 @@ working correctly.
3. Same as for Bionic to Focal.
4. As root, upgrade the database installation and OS configuration to
-match the new OS version:
+   match the new OS version:
```
apt remove upstart -y
@@ -309,11 +326,23 @@ match the new OS version:
service memcached restart
```
-5. Same as for Bionic to Focal.
-
-   That last command will finish by restarting your Zulip server; you
-   should now be able to navigate to its URL and confirm everything is
-   working correctly.
+5. Finally, we need to reinstall the current version of Zulip, which
+   among other things will recompile Zulip's Python module
+   dependencies for your new version of Python:
+
+   ```
+   rm -rf /srv/zulip-venv-cache/*
+   /home/zulip/deployments/current/scripts/lib/upgrade-zulip-stage-2 \
+       /home/zulip/deployments/current/ --ignore-static-assets
+   ```
+
+   This will finish by restarting your Zulip server; you should now be
+   able to navigate to its URL and confirm everything is working
+   correctly.
+
+6. [Upgrade from Xenial to
+   Bionic](#upgrading-from-ubuntu-16-04-xenial-to-18-04-bionic), so
+   that you are running a supported operating system.
### Upgrading from Debian Stretch to Debian Buster
@@ -339,20 +368,37 @@ working correctly.
```
apt remove upstart -y
/home/zulip/deployments/current/scripts/zulip-puppet-apply -f
-    pg_dropcluster 9.5 main --stop
+    pg_dropcluster 11 main --stop
     systemctl stop postgresql
-    pg_upgradecluster -m upgrade 9.3 main
-    pg_dropcluster 9.3 main
-    apt remove postgresql-9.3
+    pg_upgradecluster -m upgrade 9.6 main
+    pg_dropcluster 9.6 main
+    apt remove postgresql-9.6
systemctl start postgresql
service memcached restart
```
-5. Same as for Bionic to Focal.
-
-   That last command will finish by restarting your Zulip server; you
-   should now be able to navigate to its URL and confirm everything is
-   working correctly.
+5. Finally, we need to reinstall the current version of Zulip, which
+   among other things will recompile Zulip's Python module
+   dependencies for your new version of Python:
+
+   ```
+   rm -rf /srv/zulip-venv-cache/*
+   /home/zulip/deployments/current/scripts/lib/upgrade-zulip-stage-2 \
+       /home/zulip/deployments/current/ --ignore-static-assets
+   ```
+
+   This will finish by restarting your Zulip server; you should now
+   be able to navigate to its URL and confirm everything is working
+   correctly.
+
+6. [Upgrade to the latest Zulip release](#upgrading-to-a-release), now
+   that your server is running a supported operating system.
+
+7. As root, finish by verifying the contents of the full-text indexes:
+
+   ```
+   /home/zulip/deployments/current/manage.py audit_fts_indexes
+   ```
## Upgrading PostgreSQL

View File

@@ -720,34 +720,40 @@ run_test("predicate_basics", () => {
+// HTML content of message is used to determine if the message has a link, image, or attachment.
+// We are using jquery to parse the html and find existence of relevant tags/elements.
+// In tests we need to stub the calls to jquery so using zjquery's .set_find_results method.
+function set_find_results_for_msg_content(msg, jquery_selector, results) {
+    $(`<div>${msg.content}</div>`).set_find_results(jquery_selector, results);
+}
+
 const has_link = get_predicate([["has", "link"]]);
-$(img_msg.content).set_find_results("a", [$("<a>")]);
+set_find_results_for_msg_content(img_msg, "a", [$("<a>")]);
 assert(has_link(img_msg));
-$(non_img_attachment_msg.content).set_find_results("a", [$("<a>")]);
+set_find_results_for_msg_content(non_img_attachment_msg, "a", [$("<a>")]);
 assert(has_link(non_img_attachment_msg));
-$(link_msg.content).set_find_results("a", [$("<a>")]);
+set_find_results_for_msg_content(link_msg, "a", [$("<a>")]);
 assert(has_link(link_msg));
-$(no_has_filter_matching_msg.content).set_find_results("a", false);
+set_find_results_for_msg_content(no_has_filter_matching_msg, "a", false);
 assert(!has_link(no_has_filter_matching_msg));
 const has_attachment = get_predicate([["has", "attachment"]]);
-$(img_msg.content).set_find_results("a[href^='/user_uploads']", [$("<a>")]);
+set_find_results_for_msg_content(img_msg, "a[href^='/user_uploads']", [$("<a>")]);
 assert(has_attachment(img_msg));
-$(non_img_attachment_msg.content).set_find_results("a[href^='/user_uploads']", [$("<a>")]);
+set_find_results_for_msg_content(non_img_attachment_msg, "a[href^='/user_uploads']", [
+    $("<a>"),
+]);
 assert(has_attachment(non_img_attachment_msg));
-$(link_msg.content).set_find_results("a[href^='/user_uploads']", false);
+set_find_results_for_msg_content(link_msg, "a[href^='/user_uploads']", false);
 assert(!has_attachment(link_msg));
-$(no_has_filter_matching_msg.content).set_find_results("a[href^='/user_uploads']", false);
+set_find_results_for_msg_content(no_has_filter_matching_msg, "a[href^='/user_uploads']", false);
 assert(!has_attachment(no_has_filter_matching_msg));
 const has_image = get_predicate([["has", "image"]]);
-$(img_msg.content).set_find_results(".message_inline_image", [$("<img>")]);
+set_find_results_for_msg_content(img_msg, ".message_inline_image", [$("<img>")]);
 assert(has_image(img_msg));
-$(non_img_attachment_msg.content).set_find_results(".message_inline_image", false);
+set_find_results_for_msg_content(non_img_attachment_msg, ".message_inline_image", false);
 assert(!has_image(non_img_attachment_msg));
-$(link_msg.content).set_find_results(".message_inline_image", false);
+set_find_results_for_msg_content(link_msg, ".message_inline_image", false);
 assert(!has_image(link_msg));
-$(no_has_filter_matching_msg.content).set_find_results(".message_inline_image", false);
+set_find_results_for_msg_content(no_has_filter_matching_msg, ".message_inline_image", false);
 assert(!has_image(no_has_filter_matching_msg));
});

View File

@@ -0,0 +1,9 @@
# This is the list of email addresses that are accepted via SMTP;
# these consist of only the addresses in `virtual`, as well as the
# RFC822-specified postmaster.
/\+.*@/ OK
/\..*@/ OK
/^mm/ OK
/^postmaster@/ OK

View File

@@ -1,3 +1,6 @@
-/\+.*@/ zulip@localhost
-/\..*@/ zulip@localhost
-/^mm/ zulip@localhost
+# Changes to this list require a corresponding change to `access` as
+# well.
+/\+.*@/ zulip@localhost
+/\..*@/ zulip@localhost
+/^mm/ zulip@localhost

View File

@@ -5,7 +5,11 @@ class zulip::app_frontend {
include zulip::app_frontend_once
$nginx_http_only = zulipconf('application_server', 'http_only', undef)
-  $nginx_listen_port = zulipconf('application_server', 'nginx_listen_port', 443)
+  if $nginx_http_only != '' {
+    $nginx_listen_port = zulipconf('application_server', 'nginx_listen_port', 80)
+  } else {
+    $nginx_listen_port = zulipconf('application_server', 'nginx_listen_port', 443)
+  }
$no_serve_uploads = zulipconf('application_server', 'no_serve_uploads', undef)
$ssl_dir = $::osfamily ? {
'debian' => '/etc/ssl',

View File

@@ -151,15 +151,4 @@ class zulip::app_frontend_base {
mode => '0755',
source => 'puppet:///modules/zulip/nagios_plugins/zulip_app_frontend',
}
-  if $::osfamily == 'debian' {
-    # The pylibmc wheel looks for SASL plugins in the wrong place.
-    file { '/usr/lib64':
-      ensure => directory,
-    }
-    file { '/usr/lib64/sasl2':
-      ensure => link,
-      target => "/usr/lib/${::rubyplatform}/sasl2",
-    }
-  }
}

View File

@@ -67,4 +67,12 @@ class zulip::postfix_localmail {
],
}
+  file {'/etc/postfix/access':
+    ensure  => file,
+    mode    => '0644',
+    owner   => root,
+    group   => root,
+    source  => 'puppet:///modules/zulip/postfix/access',
+    require => Package[postfix],
+  }
}

View File

@@ -16,8 +16,8 @@ include /etc/nginx/zulip-include/upstreams;
server {
<% if @nginx_http_only != '' -%>
-    listen 80;
-    listen [::]:80;
+    listen <%= @nginx_listen_port %>;
+    listen [::]:<%= @nginx_listen_port %>;
<% else -%>
listen <%= @nginx_listen_port %> http2;
listen [::]:<%= @nginx_listen_port %> http2;

View File

@@ -16,6 +16,7 @@ smtpd_tls_session_cache_database = btree:${data_directory}/smtpd_scache
smtp_tls_session_cache_database = btree:${data_directory}/smtp_scache
smtpd_relay_restrictions = permit_mynetworks permit_sasl_authenticated reject_unauth_destination
+smtpd_recipient_restrictions = check_recipient_access regexp:/etc/postfix/access, reject
myhostname = <%= @fqdn %>
alias_maps = hash:/etc/aliases
alias_database = hash:/etc/aliases

View File

@@ -79,10 +79,11 @@ pika
psycopg2 --no-binary psycopg2
# Needed for memcached usage
-pylibmc
+# https://github.com/jaysonsantos/python-binary-memcached/pull/230, https://github.com/jaysonsantos/python-binary-memcached/pull/231
+https://github.com/jaysonsantos/python-binary-memcached/archive/364ce723ea73290a6ae27551cab28070424fd280.zip#egg=python-binary-memcached==0.29.0+git
-# Needed for compression support in memcached via pylibmc
-django-pylibmc
+# Needed for compression support in memcached via python-binary-memcached
+django-bmemcached
# Needed for zerver/tests/test_timestamp.py
python-dateutil
@@ -177,9 +178,6 @@ pyahocorasick
# Used for rate limiting authentication.
decorator
-# Use SameSite cookies in legacy Django (remove with Django 2.1)
-django-cookies-samesite
# For server-side enforcement of password strength
zxcvbn

View File

@@ -273,8 +273,8 @@ django-bitfield==2.0.1 \
--hash=sha256:83bfa27da718caff436f646369ce58e2d9f922e1f3d65a93f0b731a835cbfc58 \
--hash=sha256:ab340eb50cdb1e8c005594b9f8170a95a698102d06cf3f5031763be2750a8862 \
# via -r requirements/common.in
-django-cookies-samesite==0.6.6 \
-    --hash=sha256:a26dc27bfc446279c981a301b053eff845b93d9ba62798e281c90584a7ccaa4a \
+django-bmemcached==0.3.0 \
+    --hash=sha256:4e4b7d97216dbae331c1de10e699ca22804b94ec3a90d2762dd5d146e6986a8a \
# via -r requirements/common.in
django-formtools==2.2 \
--hash=sha256:304fa777b8ef9e0693ce7833f885cb89ba46b0e46fc23b01176900a93f46742f \
@@ -288,10 +288,6 @@ django-phonenumber-field==3.0.1 \
--hash=sha256:1ab19f723928582fed412bd9844221fa4ff466276d8526b8b4a9913ee1487c5e \
--hash=sha256:794ebbc3068a7af75aa72a80cb0cec67e714ff8409a965968040f1fd210b2d97 \
# via django-two-factor-auth
-django-pylibmc==0.6.1 \
-    --hash=sha256:02b591933a029eb552388cced713028f3c6cbb021639fc8de388bd1ca87981d4 \
-    --hash=sha256:9cffdee703aaf9ebc029d9dbdee8abdd0723564b95e4b2ac59e4a668b8e58f93 \
-    # via -r requirements/common.in
django-sendfile2==0.6.0 \
--hash=sha256:7f850040ddc29c9c42192ed85b915465a3ed7cced916c4fafdd5eda057dd06ec \
# via -r requirements/common.in
@@ -775,13 +771,6 @@ pyjwt==1.7.1 \
--hash=sha256:5c6eca3c2940464d106b99ba83b00c6add741c9becaec087fb7ccdefea71350e \
--hash=sha256:8d59a976fb773f3e6a39c85636357c4f0e242707394cadadd9814f5cbaa20e96 \
# via -r requirements/common.in, apns2, social-auth-core, twilio
-pylibmc==1.6.1 \
-    --hash=sha256:01a7e2e3fa9fcd7a791c7818a80a07e7a381aee988a5d810a1c1e6f7a9a288fd \
-    --hash=sha256:6fff384e3c30af029bbac87f88b3fab14ae87b50103d389341d9b3e633349a3f \
-    --hash=sha256:8a8dd406487d419d58c6d944efd91e8189b360a0c4d9e8c6ebe3990d646ae7e9 \
-    --hash=sha256:c749b4251c1137837d00542b62992b96cd2aed639877407f66291120dd6de2ff \
-    --hash=sha256:e6c0c452336db0868d0de521d48872c2a359b1233b974c6b32c36ce68abc4820 \
-    # via -r requirements/common.in, django-pylibmc
pyoembed==0.1.2 \
--hash=sha256:0f755c8308039f1e49238e95ea94ef16aa08add9f32075ba13ab9b65f32ff582 \
# via -r requirements/common.in
@@ -796,6 +785,9 @@ pyparsing==2.4.7 \
pyrsistent==0.16.0 \
--hash=sha256:28669905fe725965daa16184933676547c5bb40a5153055a8dee2a4bd7933ad3 \
# via jsonschema
+https://github.com/jaysonsantos/python-binary-memcached/archive/364ce723ea73290a6ae27551cab28070424fd280.zip#egg=python-binary-memcached==0.29.0+git \
+    --hash=sha256:0e6f4c7c34c71e29e1daa53cca6598dcbdb8bd49d7c6aaac6c02d93bdc2d5a8f \
+    # via -r requirements/common.in, django-bmemcached
python-dateutil==2.8.1 \
--hash=sha256:73ebfe9dbf22e832286dafa60473e4cd239f8592f699aa5adaf10050e6e1823c \
--hash=sha256:75bb3f31ea686f1197762692a9ee6a7550b59fc6ca3a1f4b5d7e32fb98e2da2a \
@@ -925,7 +917,7 @@ sh==1.12.14 \
six==1.15.0 \
--hash=sha256:30639c035cdb23534cd4aa2dd52c3bf48f06e5f4a941509c8bafd8ce11080259 \
--hash=sha256:8b74bedcbbbaca38ff6d7491d76f2b06b3592611af620f8426e82dddb04a5ced \
-    # via argon2-cffi, automat, aws-sam-translator, cfn-lint, cryptography, django-bitfield, docker, ecdsa, hypchat, isodate, jsonschema, junit-xml, libthumbor, moto, openapi-core, openapi-schema-validator, openapi-spec-validator, packaging, parsel, pip-tools, protego, pyopenssl, python-dateutil, python-debian, python-jose, qrcode, responses, social-auth-app-django, social-auth-core, talon, traitlets, twilio, w3lib, websocket-client, zulip
+    # via argon2-cffi, automat, aws-sam-translator, cfn-lint, cryptography, django-bitfield, docker, ecdsa, hypchat, isodate, jsonschema, junit-xml, libthumbor, moto, openapi-core, openapi-schema-validator, openapi-spec-validator, packaging, parsel, pip-tools, protego, pyopenssl, python-binary-memcached, python-dateutil, python-debian, python-jose, qrcode, responses, social-auth-app-django, social-auth-core, talon, traitlets, twilio, w3lib, websocket-client, zulip
snakeviz==2.1.0 \
--hash=sha256:8ce375b18ae4a749516d7e6c6fbbf8be6177c53974f53534d8eadb646cd279b1 \
--hash=sha256:92ad876fb6a201a7e23a6b85ea96d9643a51e285667c253a8653643804f7cb68 \
@@ -1111,10 +1103,9 @@ typing-extensions==3.7.4.2 \
--hash=sha256:79ee589a3caca649a9bfd2a8de4709837400dfa00b6cc81962a1e6a1815969ae \
--hash=sha256:f8d2bd89d25bc39dabe7d23df520442fa1d8969b82544370e03d88b5a591c392 \
# via -r requirements/common.in, mypy, zulint
-ua-parser==0.10.0 \
-    --hash=sha256:46ab2e383c01dbd2ab284991b87d624a26a08f72da4d7d413f5bfab8b9036f8a \
-    --hash=sha256:47b1782ed130d890018d983fac37c2a80799d9e0b9c532e734c67cf70f185033 \
-    # via django-cookies-samesite
+uhashring==1.2 \
+    --hash=sha256:f7304ca2ff763bbf1e2f8a78f21131721811619c5841de4f8c98063344906931 \
+    # via python-binary-memcached
https://github.com/zulip/ultrajson/archive/70ac02becc3e11174cd5072650f885b30daab8a8.zip#egg=ujson==1.35+git \
--hash=sha256:e95c20f47093dc7376ddf70b95489979375fb6e88b8d7e4b5576d917dda8ef5a \
# via -r requirements/common.in

View File

@@ -185,8 +185,8 @@ django-bitfield==2.0.1 \
--hash=sha256:83bfa27da718caff436f646369ce58e2d9f922e1f3d65a93f0b731a835cbfc58 \
--hash=sha256:ab340eb50cdb1e8c005594b9f8170a95a698102d06cf3f5031763be2750a8862 \
# via -r requirements/common.in
-django-cookies-samesite==0.6.6 \
-    --hash=sha256:a26dc27bfc446279c981a301b053eff845b93d9ba62798e281c90584a7ccaa4a \
+django-bmemcached==0.3.0 \
+    --hash=sha256:4e4b7d97216dbae331c1de10e699ca22804b94ec3a90d2762dd5d146e6986a8a \
# via -r requirements/common.in
django-formtools==2.2 \
--hash=sha256:304fa777b8ef9e0693ce7833f885cb89ba46b0e46fc23b01176900a93f46742f \
@@ -200,10 +200,6 @@ django-phonenumber-field==3.0.1 \
--hash=sha256:1ab19f723928582fed412bd9844221fa4ff466276d8526b8b4a9913ee1487c5e \
--hash=sha256:794ebbc3068a7af75aa72a80cb0cec67e714ff8409a965968040f1fd210b2d97 \
# via django-two-factor-auth
-django-pylibmc==0.6.1 \
-    --hash=sha256:02b591933a029eb552388cced713028f3c6cbb021639fc8de388bd1ca87981d4 \
-    --hash=sha256:9cffdee703aaf9ebc029d9dbdee8abdd0723564b95e4b2ac59e4a668b8e58f93 \
-    # via -r requirements/common.in
django-sendfile2==0.6.0 \
--hash=sha256:7f850040ddc29c9c42192ed85b915465a3ed7cced916c4fafdd5eda057dd06ec \
# via -r requirements/common.in
@@ -554,13 +550,6 @@ pyjwt==1.7.1 \
--hash=sha256:5c6eca3c2940464d106b99ba83b00c6add741c9becaec087fb7ccdefea71350e \
--hash=sha256:8d59a976fb773f3e6a39c85636357c4f0e242707394cadadd9814f5cbaa20e96 \
# via -r requirements/common.in, apns2, social-auth-core, twilio
-pylibmc==1.6.1 \
-    --hash=sha256:01a7e2e3fa9fcd7a791c7818a80a07e7a381aee988a5d810a1c1e6f7a9a288fd \
-    --hash=sha256:6fff384e3c30af029bbac87f88b3fab14ae87b50103d389341d9b3e633349a3f \
-    --hash=sha256:8a8dd406487d419d58c6d944efd91e8189b360a0c4d9e8c6ebe3990d646ae7e9 \
-    --hash=sha256:c749b4251c1137837d00542b62992b96cd2aed639877407f66291120dd6de2ff \
-    --hash=sha256:e6c0c452336db0868d0de521d48872c2a359b1233b974c6b32c36ce68abc4820 \
-    # via -r requirements/common.in, django-pylibmc
pyoembed==0.1.2 \
--hash=sha256:0f755c8308039f1e49238e95ea94ef16aa08add9f32075ba13ab9b65f32ff582 \
# via -r requirements/common.in
@@ -571,6 +560,9 @@ pyopenssl==19.1.0 \
pyrsistent==0.16.0 \
--hash=sha256:28669905fe725965daa16184933676547c5bb40a5153055a8dee2a4bd7933ad3 \
# via jsonschema
+https://github.com/jaysonsantos/python-binary-memcached/archive/364ce723ea73290a6ae27551cab28070424fd280.zip#egg=python-binary-memcached==0.29.0+git \
+    --hash=sha256:0e6f4c7c34c71e29e1daa53cca6598dcbdb8bd49d7c6aaac6c02d93bdc2d5a8f \
+    # via -r requirements/common.in, django-bmemcached
python-dateutil==2.8.1 \
--hash=sha256:73ebfe9dbf22e832286dafa60473e4cd239f8592f699aa5adaf10050e6e1823c \
--hash=sha256:75bb3f31ea686f1197762692a9ee6a7550b59fc6ca3a1f4b5d7e32fb98e2da2a \
@@ -661,7 +653,7 @@ s3transfer==0.3.3 \
six==1.15.0 \
--hash=sha256:30639c035cdb23534cd4aa2dd52c3bf48f06e5f4a941509c8bafd8ce11080259 \
--hash=sha256:8b74bedcbbbaca38ff6d7491d76f2b06b3592611af620f8426e82dddb04a5ced \
-    # via argon2-cffi, cryptography, django-bitfield, hypchat, isodate, jsonschema, libthumbor, openapi-core, openapi-schema-validator, openapi-spec-validator, pyopenssl, python-dateutil, qrcode, social-auth-app-django, social-auth-core, talon, traitlets, twilio, zulip
+    # via argon2-cffi, cryptography, django-bitfield, hypchat, isodate, jsonschema, libthumbor, openapi-core, openapi-schema-validator, openapi-spec-validator, pyopenssl, python-binary-memcached, python-dateutil, qrcode, social-auth-app-django, social-auth-core, talon, traitlets, twilio, zulip
social-auth-app-django==4.0.0 \
--hash=sha256:2c69e57df0b30c9c1823519c5f1992cbe4f3f98fdc7d95c840e091a752708840 \
--hash=sha256:567ad0e028311541d7dfed51d3bf2c60440a6fd236d5d4d06c5a618b3d6c57c5 \
@@ -747,10 +739,9 @@ typing-extensions==3.7.4.2 \
--hash=sha256:79ee589a3caca649a9bfd2a8de4709837400dfa00b6cc81962a1e6a1815969ae \
--hash=sha256:f8d2bd89d25bc39dabe7d23df520442fa1d8969b82544370e03d88b5a591c392 \
# via -r requirements/common.in
-ua-parser==0.10.0 \
-    --hash=sha256:46ab2e383c01dbd2ab284991b87d624a26a08f72da4d7d413f5bfab8b9036f8a \
-    --hash=sha256:47b1782ed130d890018d983fac37c2a80799d9e0b9c532e734c67cf70f185033 \
-    # via django-cookies-samesite
+uhashring==1.2 \
+    --hash=sha256:f7304ca2ff763bbf1e2f8a78f21131721811619c5841de4f8c98063344906931 \
+    # via python-binary-memcached
https://github.com/zulip/ultrajson/archive/70ac02becc3e11174cd5072650f885b30daab8a8.zip#egg=ujson==1.35+git \
--hash=sha256:e95c20f47093dc7376ddf70b95489979375fb6e88b8d7e4b5576d917dda8ef5a \
# via -r requirements/common.in

View File

@@ -14,7 +14,6 @@ done
is_centos=false
is_rhel=false
-is_rhel_registered=false
if [ -e /etc/centos-release ]; then
is_centos=true
yum install -y epel-release
@@ -27,12 +26,6 @@ if [ -e /etc/centos-release ]; then
elif grep -q "Red Hat" /etc/redhat-release; then
is_rhel=true
yum localinstall -y https://dl.fedoraproject.org/pub/epel/epel-release-latest-7.noarch.rpm
-    if subscription-manager status; then
-        # See https://access.redhat.com/discussions/2217891#comment-1032701
-        is_rhel_registered=true
-        # libmemcached-devel can be installed directly if the machine is registered
-        subscription-manager repos --enable "rhel-*-optional-rpms" --enable "rhel-*-extras-rpms"
-    fi
fi
yum update -y
@@ -51,10 +44,6 @@ if [ "$is_centos" = true ]; then
# https://pgroonga.github.io/install/centos.html
yum localinstall -y https://packages.groonga.org/centos/groonga-release-latest.noarch.rpm
elif [ "$is_rhel" = true ]; then
if [ "$is_rhel_registered" = false ]; then
echo "This machine is unregistered; installing libmemcached-devel from a CentOS mirror ..."
yum localinstall -y http://mirror.centos.org/centos/7/os/x86_64/Packages/libmemcached-devel-1.0.16-5.el7.x86_64.rpm
fi
yum localinstall -y https://download.postgresql.org/pub/repos/yum/10/redhat/rhel-latest-x86_64/pgdg-redhat10-10-2.noarch.rpm
yum localinstall -y https://packages.groonga.org/centos/groonga-release-latest.noarch.rpm
else

View File

@@ -17,7 +17,6 @@ VENV_DEPENDENCIES = [
"zlib1g-dev", # Needed to handle compressed PNGs with Pillow
"libjpeg-dev", # Needed to handle JPEGs with Pillow
"libldap2-dev",
"libmemcached-dev",
"python3-dev", # Needed to install typed-ast dependency of mypy
"python3-pip",
"virtualenv",
@@ -35,6 +34,8 @@ VENV_DEPENDENCIES = [
# on upgrade of a production server, and it's not worth adding
# another call to `apt install` for.
"jq", # Used by scripts/lib/install-node to check yarn version
"libsasl2-dev", # For building python-ldap from source
]
COMMON_YUM_VENV_DEPENDENCIES = [
@@ -43,7 +44,6 @@ COMMON_YUM_VENV_DEPENDENCIES = [
"zlib-devel",
"libjpeg-turbo-devel",
"openldap-devel",
"libmemcached-devel",
# Needed by python-xmlsec:
"gcc"
"python3-devel",

View File

@@ -9,15 +9,10 @@ from scripts.lib.setup_path import setup_path
setup_path()
-import pylibmc
+import bmemcached

 from zproject import settings

-assert isinstance(settings.CACHES["default"], dict)  # for mypy
-pylibmc.Client(
-    [settings.MEMCACHED_LOCATION],
-    binary=True,
-    username=settings.MEMCACHED_USERNAME,
-    password=settings.MEMCACHED_PASSWORD,
-    behaviors=settings.CACHES["default"]["OPTIONS"],
-).flush_all()
+cache = settings.CACHES["default"]
+assert isinstance(cache, dict)  # for mypy
+bmemcached.Client((cache["LOCATION"],), **cache["OPTIONS"]).flush_all()

View File

@@ -538,7 +538,7 @@ function validate_stream_message_post_policy(sub) {
const person = people.get_by_user_id(page_params.user_id);
const current_datetime = new Date(Date.now());
const person_date_joined = new Date(person.date_joined);
-    const days = new Date(current_datetime - person_date_joined).getDate();
+    const days = (current_datetime - person_date_joined) / 1000 / 86400;
let error_text;
if (
stream_post_policy === stream_post_permission_type.non_new_members.code &&

View File

@@ -16,16 +16,25 @@ function add_messages(messages, msg_list, opts) {
return render_info;
}
+// We need to check if the message content contains the specified HTML
+// elements.  We wrap the message.content in a <div>; this is
+// important because $("Text <a>link</a>").find("a") returns nothing;
+// one needs an outer element wrapping an object to use this
+// construction.
+function is_element_in_message_content(message, element_selector) {
+    return $(`<div>${message.content}</div>`).find(element_selector).length > 0;
+}
+
 exports.message_has_link = function (message) {
-    return $(message.content).find("a").length > 0;
+    return is_element_in_message_content(message, "a");
 };

 exports.message_has_image = function (message) {
-    return $(message.content).find(".message_inline_image").length > 0;
+    return is_element_in_message_content(message, ".message_inline_image");
 };

 exports.message_has_attachment = function (message) {
-    return $(message.content).find("a[href^='/user_uploads']").length > 0;
+    return is_element_in_message_content(message, "a[href^='/user_uploads']");
 };
exports.add_old_messages = function (messages, msg_list) {

View File

@@ -480,11 +480,22 @@ exports.set_stream_property = function (sub, property, value, status_element) {
exports.bulk_set_stream_property([sub_data], status_element);
};
+function get_message_retention_days_from_sub(sub) {
+    if (sub.message_retention_days === null) {
+        return "realm_default";
+    }
+    if (sub.message_retention_days === -1) {
+        return "forever";
+    }
+    return sub.message_retention_days;
+}
function change_stream_privacy(e) {
e.stopPropagation();
const stream_id = $(e.target).data("stream-id");
const sub = stream_data.get_sub_by_id(stream_id);
+    const data = {};
const privacy_setting = $("#stream_privacy_modal input[name=privacy]:checked").val();
const stream_post_policy = parseInt(
@@ -492,6 +503,10 @@ function change_stream_privacy(e) {
10,
);
+    if (sub.stream_post_policy !== stream_post_policy) {
+        data.stream_post_policy = JSON.stringify(stream_post_policy);
+    }
let invite_only;
let history_public_to_subscribers;
@@ -506,28 +521,38 @@ function change_stream_privacy(e) {
history_public_to_subscribers = true;
}
$(".stream_change_property_info").hide();
const data = {
stream_name: sub.name,
// toggle the privacy setting
is_private: JSON.stringify(invite_only),
stream_post_policy: JSON.stringify(stream_post_policy),
history_public_to_subscribers: JSON.stringify(history_public_to_subscribers),
};
if (
sub.invite_only !== invite_only ||
sub.history_public_to_subscribers !== history_public_to_subscribers
) {
data.is_private = JSON.stringify(invite_only);
data.history_public_to_subscribers = JSON.stringify(history_public_to_subscribers);
}
if (page_params.is_owner) {
-        let message_retention_days = $(
-            "#stream_privacy_modal select[name=stream_message_retention_setting]",
-        ).val();
-        if (message_retention_days === "retain_for_period") {
-            message_retention_days = parseInt(
-                $("#stream_privacy_modal input[name=stream-message-retention-days]").val(),
-                10,
-            );
-        }
+        let message_retention_days = $(
+            "#stream_privacy_modal select[name=stream_message_retention_setting]",
+        ).val();
+        if (message_retention_days === "retain_for_period") {
+            message_retention_days = parseInt(
+                $("#stream_privacy_modal input[name=stream-message-retention-days]").val(),
+                10,
+            );
+        }
+
+        const message_retention_days_from_sub = get_message_retention_days_from_sub(sub);
+        if (message_retention_days_from_sub !== message_retention_days) {
+            data.message_retention_days = JSON.stringify(message_retention_days);
+        }
+
+    $(".stream_change_property_info").hide();
+
+    if (Object.keys(data).length === 0) {
+        overlays.close_modal("#stream_privacy_modal");
+        $("#stream_privacy_modal").remove();
+        return;
+    }
channel.patch({
url: "/json/streams/" + stream_id,
data: data,

View File

@@ -228,7 +228,7 @@ exports.setup_upload = function (config) {
}
const split_uri = uri.split("/");
const filename = split_uri[split_uri.length - 1];
-    if (!compose_state.composing()) {
+    if (config.mode === "compose" && !compose_state.composing()) {
compose_actions.start("stream");
}
const absolute_uri = exports.make_upload_absolute(uri);

View File

@@ -2133,8 +2133,6 @@ div.topic_edit_spinner .loading_indicator_spinner {
}
#do_delete_message_spinner {
display: none;
-    width: 0;
-    margin: 0 auto;
}

View File

@@ -92,7 +92,7 @@ RUN apt-get update \
memcached rabbitmq-server redis-server \
hunspell-en-us supervisor libssl-dev puppet \
gettext libffi-dev libfreetype6-dev zlib1g-dev \
-    libjpeg-dev libldap2-dev libmemcached-dev \
+    libjpeg-dev libldap2-dev \
libxml2-dev libxslt1-dev libpq-dev moreutils \
{extra_packages}

View File

@@ -52,7 +52,7 @@ run apt-get install -y --no-install-recommends \
memcached redis-server \
hunspell-en-us supervisor libssl-dev puppet \
gettext libffi-dev libfreetype6-dev zlib1g-dev libjpeg-dev \
-    libldap2-dev libmemcached-dev \
+    libldap2-dev \
libxml2-dev libxslt1-dev libpq-dev \
virtualenv \
"${extra_packages[@]}"

View File

@@ -1,6 +1,6 @@
import os
ZULIP_VERSION = "4.0-dev+git"
ZULIP_VERSION = "3.2"
# Add information on number of commits and commit hash to version, if available
zulip_git_version_file = os.path.join(os.path.dirname(os.path.abspath(__file__)), 'zulip-git-version')
if os.path.exists(zulip_git_version_file):
@@ -10,7 +10,7 @@ if os.path.exists(zulip_git_version_file):
ZULIP_VERSION = version
LATEST_MAJOR_VERSION = "3.0"
LATEST_RELEASE_VERSION = "3.0"
LATEST_RELEASE_VERSION = "3.2"
LATEST_RELEASE_ANNOUNCEMENT = "https://blog.zulip.org/2020/07/16/zulip-3-0-released/"
LATEST_DESKTOP_VERSION = "5.3.0"
@@ -44,4 +44,4 @@ API_FEATURE_LEVEL = 27
# historical commits sharing the same major version, in which case a
# minor version bump suffices.
-PROVISION_VERSION = '90.1'
+PROVISION_VERSION = '92.0'

View File

@@ -1,5 +1,6 @@
import logging
import re
+from email.headerregistry import AddressHeader
from email.message import EmailMessage
from typing import Dict, List, Optional, Tuple
@@ -310,9 +311,14 @@ def find_emailgateway_recipient(message: EmailMessage) -> str:
for header_name in recipient_headers:
for header_value in message.get_all(header_name, []):
-            for addr in header_value.addresses:
-                if match_email_re.match(addr.addr_spec):
-                    return addr.addr_spec
+            if isinstance(header_value, AddressHeader):
+                emails = [addr.addr_spec for addr in header_value.addresses]
+            else:
+                emails = [str(header_value)]
+            for email in emails:
+                if match_email_re.match(email):
+                    return email
raise ZulipEmailForwardError("Missing recipient in mirror email")

View File

@@ -3581,6 +3581,14 @@ class TestZulipRemoteUserBackend(DesktopFlowTestingLib, ZulipTestCase):
self.assertEqual(result.status_code, 302)
self.assert_logged_in_user_id(user_profile.id)
def test_login_case_insensitive(self) -> None:
user_profile = self.example_user('hamlet')
email_upper = user_profile.delivery_email.upper()
with self.settings(AUTHENTICATION_BACKENDS=('zproject.backends.ZulipRemoteUserBackend',)):
result = self.client_get('/accounts/login/sso/', REMOTE_USER=email_upper)
self.assertEqual(result.status_code, 302)
self.assert_logged_in_user_id(user_profile.id)
def test_login_failure(self) -> None:
email = self.example_email("hamlet")
result = self.client_get('/accounts/login/sso/', REMOTE_USER=email)

View File

@@ -232,6 +232,36 @@ class TestStreamEmailMessagesSuccess(ZulipTestCase):
self.assertEqual(get_display_recipient(message.recipient), stream.name)
self.assertEqual(message.topic_name(), incoming_valid_message['Subject'])
# Test receiving an email with the address on an UnstructuredHeader
# (e.g. Envelope-To) instead of an AddressHeader (e.g. To).
# https://github.com/zulip/zulip/issues/15864
def test_receive_stream_email_messages_other_header_success(self) -> None:
user_profile = self.example_user('hamlet')
self.login_user(user_profile)
self.subscribe(user_profile, "Denmark")
stream = get_stream("Denmark", user_profile.realm)
stream_to_address = encode_email_address(stream)
incoming_valid_message = EmailMessage()
incoming_valid_message.set_content('TestStreamEmailMessages Body')
incoming_valid_message['Subject'] = 'TestStreamEmailMessages Subject'
incoming_valid_message['From'] = self.example_email('hamlet')
# Simulate a mailing list
incoming_valid_message['To'] = "foo-mailinglist@example.com"
incoming_valid_message['Envelope-To'] = stream_to_address
incoming_valid_message['Reply-to'] = self.example_email('othello')
process_message(incoming_valid_message)
# Hamlet is subscribed to this stream so should see the email message from Othello.
message = most_recent_message(user_profile)
self.assertEqual(message.content, "TestStreamEmailMessages Body")
self.assertEqual(get_display_recipient(message.recipient), stream.name)
self.assertEqual(message.topic_name(), incoming_valid_message['Subject'])
def test_receive_stream_email_messages_blank_subject_success(self) -> None:
user_profile = self.example_user('hamlet')
self.login_user(user_profile)

View File

@@ -408,13 +408,6 @@ class NormalActionsTest(BaseAction):
('edit_timestamp', check_int),
('message_id', check_int),
('message_ids', check_list(check_int)),
-            ('prior_mention_user_ids', check_list(check_int)),
-            ('mention_user_ids', check_list(check_int)),
-            ('wildcard_mention_user_ids', check_list(check_int)),
-            ('presence_idle_user_ids', check_list(check_int)),
-            ('stream_push_user_ids', check_list(check_int)),
-            ('stream_email_user_ids', check_list(check_int)),
-            ('push_notify_user_ids', check_list(check_int)),
('orig_content', check_string),
('orig_rendered_content', check_string),
(ORIG_TOPIC, check_string),

View File

@@ -4098,15 +4098,16 @@ class TestFindMyTeam(ZulipTestCase):
self.assertIn("Find your Zulip accounts", result.content.decode('utf8'))
def test_result(self) -> None:
+        # We capitalize a letter in cordelia's email to test that the search is case-insensitive.
         result = self.client_post('/accounts/find/',
-                                  dict(emails="iago@zulip.com,cordelia@zulip.com"))
+                                  dict(emails="iago@zulip.com,cordeliA@zulip.com"))
         self.assertEqual(result.status_code, 302)
-        self.assertEqual(result.url, "/accounts/find/?emails=iago%40zulip.com%2Ccordelia%40zulip.com")
+        self.assertEqual(result.url, "/accounts/find/?emails=iago%40zulip.com%2CcordeliA%40zulip.com")
         result = self.client_get(result.url)
         content = result.content.decode('utf8')
         self.assertIn("Emails sent! You will only receive emails", content)
-        self.assertIn(self.example_email("iago"), content)
-        self.assertIn(self.example_email("cordelia"), content)
+        self.assertIn("iago@zulip.com", content)
+        self.assertIn("cordeliA@zulip.com", content)
from django.core.mail import outbox
# 3 = 1 + 2 -- Cordelia gets an email each for the "zulip" and "lear" realms.

View File

@@ -1007,15 +1007,18 @@ def process_deletion_event(event: Mapping[str, Any], users: Iterable[int]) -> No
del compatibility_event['message_ids']
client.add_event(compatibility_event)
-def process_message_update_event(event_template: Mapping[str, Any],
+def process_message_update_event(orig_event: Mapping[str, Any],
                                  users: Iterable[Mapping[str, Any]]) -> None:
-    prior_mention_user_ids = set(event_template.get('prior_mention_user_ids', []))
-    mention_user_ids = set(event_template.get('mention_user_ids', []))
-    presence_idle_user_ids = set(event_template.get('presence_idle_user_ids', []))
-    stream_push_user_ids = set(event_template.get('stream_push_user_ids', []))
-    stream_email_user_ids = set(event_template.get('stream_email_user_ids', []))
-    wildcard_mention_user_ids = set(event_template.get('wildcard_mention_user_ids', []))
-    push_notify_user_ids = set(event_template.get('push_notify_user_ids', []))
+    # Extract the parameters passed via the event object that don't
+    # belong in the actual events.
+    event_template = dict(orig_event)
+    prior_mention_user_ids = set(event_template.pop('prior_mention_user_ids', []))
+    mention_user_ids = set(event_template.pop('mention_user_ids', []))
+    presence_idle_user_ids = set(event_template.pop('presence_idle_user_ids', []))
+    stream_push_user_ids = set(event_template.pop('stream_push_user_ids', []))
+    stream_email_user_ids = set(event_template.pop('stream_email_user_ids', []))
+    wildcard_mention_user_ids = set(event_template.pop('wildcard_mention_user_ids', []))
+    push_notify_user_ids = set(event_template.pop('push_notify_user_ids', []))
stream_name = event_template.get('stream_name')
message_id = event_template['message_id']

View File

@@ -7,6 +7,7 @@ from django.conf import settings
from django.contrib.auth import authenticate, get_backends
from django.core import validators
from django.core.exceptions import ValidationError
+from django.db.models import Q
from django.http import HttpRequest, HttpResponse, HttpResponseRedirect
from django.shortcuts import redirect, render
from django.urls import reverse
@@ -601,8 +602,15 @@ def find_account(request: HttpRequest) -> HttpResponse:
form = FindMyTeamForm(request.POST)
if form.is_valid():
emails = form.cleaned_data['emails']
+            # Django doesn't support __iexact__in lookup with EmailField, so we have
+            # to use Qs to get around that without needing to do multiple queries.
+            emails_q = Q()
+            for email in emails:
+                emails_q |= Q(delivery_email__iexact=email)
             for user in UserProfile.objects.filter(
-                    delivery_email__in=emails, is_active=True, is_bot=False,
+                    emails_q, is_active=True, is_bot=False,
                     realm__deactivated=False):
context = common_context(user)
context.update({

View File

@@ -4,7 +4,7 @@ import random
from datetime import datetime
from typing import Any, Callable, Dict, List, Mapping, Sequence, Tuple
-import pylibmc
+import bmemcached
import ujson
from django.conf import settings
from django.contrib.sessions.models import Session
@@ -79,13 +79,9 @@ def clear_database() -> None:
# With `zproject.test_settings`, we aren't using real memcached
# and we only need to flush memcached if we're populating a
# database that would be used with it (i.e. zproject.dev_settings).
-    if default_cache['BACKEND'] == 'django_pylibmc.memcached.PyLibMCCache':
-        pylibmc.Client(
-            [default_cache['LOCATION']],
-            binary=True,
-            username=default_cache["USERNAME"],
-            password=default_cache["PASSWORD"],
-            behaviors=default_cache["OPTIONS"],
+    if default_cache['BACKEND'] == 'django_bmemcached.memcached.BMemcached':
+        bmemcached.Client(
+            (default_cache['LOCATION'],), **default_cache['OPTIONS'],
         ).flush_all()
model: Any = None # Hack because mypy doesn't know these are model classes

View File

@@ -38,6 +38,7 @@ from jwt.exceptions import PyJWTError
from lxml.etree import XMLSyntaxError
from onelogin.saml2.errors import OneLogin_Saml2_Error
from onelogin.saml2.response import OneLogin_Saml2_Response
+from onelogin.saml2.settings import OneLogin_Saml2_Settings
from requests import HTTPError
from social_core.backends.apple import AppleIdAuth
from social_core.backends.azuread import AzureADOAuth2
@@ -987,7 +988,7 @@ class ExternalAuthResult:
if self.user_profile is not None:
# Ensure data inconsistent with the user_profile wasn't passed in inside the data_dict argument.
assert 'full_name' not in data_dict or data_dict['full_name'] == self.user_profile.full_name
-            assert 'email' not in data_dict or data_dict['email'] == self.user_profile.delivery_email
+            assert 'email' not in data_dict or data_dict['email'].lower() == self.user_profile.delivery_email.lower()
# Update these data_dict fields to ensure consistency with self.user_profile. This is mostly
# defensive code, but is useful in these scenarios:
# 1. user_profile argument was passed in, and no full_name or email_data in the data_dict arg.
@@ -1774,8 +1775,7 @@ class SAMLAuthBackend(SocialAuthMixin, SAMLAuth):
return data
-    @classmethod
-    def get_issuing_idp(cls, SAMLResponse: str) -> Optional[str]:
+    def get_issuing_idp(self, SAMLResponse: str) -> Optional[str]:
"""
Given a SAMLResponse, returns which of the configured IdPs is declared as the issuer.
This value MUST NOT be trusted as the true issuer!
@@ -1786,11 +1786,12 @@ class SAMLAuthBackend(SocialAuthMixin, SAMLAuth):
of the configured IdPs' information to use for parsing and validating the response.
"""
try:
-            resp = OneLogin_Saml2_Response(settings={}, response=SAMLResponse)
+            config = self.generate_saml_config()
+            saml_settings = OneLogin_Saml2_Settings(config, sp_validation_only=True)
+            resp = OneLogin_Saml2_Response(settings=saml_settings, response=SAMLResponse)
             issuers = resp.get_issuers()
-        except cls.SAMLRESPONSE_PARSING_EXCEPTIONS:
-            logger = logging.getLogger(f"zulip.auth.{cls.name}")
-            logger.info("Error while parsing SAMLResponse:", exc_info=True)
+        except self.SAMLRESPONSE_PARSING_EXCEPTIONS:
+            self.logger.info("Error while parsing SAMLResponse:", exc_info=True)
return None
for idp_name, idp_config in settings.SOCIAL_AUTH_SAML_ENABLED_IDPS.items():

View File

@@ -326,24 +326,17 @@ RABBITMQ_PASSWORD = get_secret("rabbitmq_password")
SESSION_ENGINE = "django.contrib.sessions.backends.cached_db"
# Compress large values being stored in memcached; this is important
# for at least the realm_users cache.
-PYLIBMC_MIN_COMPRESS_LEN = 100 * 1024
-PYLIBMC_COMPRESS_LEVEL = 1
MEMCACHED_PASSWORD = get_secret("memcached_password")
CACHES = {
'default': {
-        'BACKEND': 'django_pylibmc.memcached.PyLibMCCache',
+        'BACKEND': 'django_bmemcached.memcached.BMemcached',
'LOCATION': MEMCACHED_LOCATION,
'TIMEOUT': 3600,
-        'BINARY': True,
-        'USERNAME': MEMCACHED_USERNAME,
-        'PASSWORD': MEMCACHED_PASSWORD,
         'OPTIONS': {
-            'tcp_nodelay': True,
-            'retry_timeout': 1,
-            'socket_timeout': 3600,
+            'username': MEMCACHED_USERNAME,
+            'password': MEMCACHED_PASSWORD,
+            'pickle_protocol': 4,
         },
},
'database': {
@@ -400,8 +393,6 @@ REDIS_PASSWORD = get_secret('redis_password')
# SECURITY SETTINGS
########################################################################
-SESSION_COOKIE_SAMESITE = 'Lax'
# Tell the browser to never send our cookies without encryption, e.g.
# when executing the initial http -> https redirect.
#